You’ve never lived until you’ve had to download a driver from an archived forum post on the Internet Archive’s Wayback Machine. You have no idea if it’s going to work, but it’s your only option. So you bite the bullet. I recently did this with a PCI-based SATA card I was attempting to flash to support a PowerPC-based Mac, and while it was a bit of a leap of faith, it actually ended up working. Score one for chance. But this, increasingly, feels like a way of life for people trying to keep old hardware alive, even though all these drivers generally need to do is sit on the internet, available when someone needs them. This problem is only going to get worse as time progresses. We’ll have to hope random people on the internet are kind enough to upload any drivers they’ve collected and held on to over the years, so users of classic hardware can keep their machines running.
Some Black Friday deals are wild. A store might offer only a couple of units of a particular TV, discounted by 66%. There might be a few pieces of a flagship smartphone at your local electronics store at half price. These are designed to entice customers through the door, and if you’re brave enough, to endure the cold for up to 12 hours to get that bargain of the year. But browsing Amazon’s Computing and Components section every Black Friday, and particularly this year, one thing stands out: most of the discounts are for complete trash. After the headline external storage discounts, it’s just page after page of USB cables and smartphone holders. But one thing did catch my eye: an entire PC, for only £57/$61! How can an entire x86 desktop PC be sold for so little? We did the only thing worth doing: we purchased it. The listing on Amazon is for a refurbished Dell OptiPlex 780, a small-form-factor machine very typical of an office that hasn’t been updated yet (which is probably where this unit came from). The listing for the machine promises a few things: a CPU at 2.6 GHz, 4 GB of DDR3, a 160 GB HDD, and 802.11abg Wi-Fi, as well as Windows 10. What we received was a 2.93 GHz processor (woohoo!), 2×2 GB of DDR3, a 250 GB HDD (woohoo!), no Wi-Fi (boo), and a full copy of Windows 10. The fact that this comes with a full-blown copy of Windows 10 Pro, which even at its cheapest is around $20, astounds me. Even if the whole unit is a refurb, that’s the one part that is most likely new: and given that the value of the contents is around $30, that only leaves $10 for the actual hardware. Better these old office refurbs get sold on Amazon than dumped in a landfill or torn apart by children inhaling toxic fumes in India. These kinds of machines are great for alternative operating systems like Haiku, too.
A U.S.-based foundation overseeing promising semiconductor technology developed with Pentagon support will soon move to Switzerland after several of the group’s foreign members raised concerns about potential U.S. trade curbs. The nonprofit RISC-V Foundation wants to ensure that universities, governments and companies outside the United States can help develop its open-source technology, its Chief Executive Calista Redmond said in an interview with Reuters. Can’t blame them.
The Sholes and Glidden typewriter (sometimes called the Remington No. 1) was the first successful typewriter ever brought to market (in 1873), and the forerunner of most other successful typewriters. The unidentified key was, as far as I can tell, on this model and only this model. It was gone on the Remington No. 2 introduced in 1878, never to appear again (in this form), and as far as I know never found on competitors either. So what the heck is it? I love stuff like this.
Our ability to continuously shrink the features of our silicon-based processors appears to be a thing of the past, which has materials scientists considering ways to move beyond silicon. The top candidate is the carbon nanotube, which naturally comes in semiconducting forms, has fantastic electrical properties, and is extremely small. Unfortunately, it has proven extremely hard to grow the nanotubes where they’re needed and just as difficult to manipulate them to place them in the right location. There has been some progress in working around these challenges, but the results have typically been shown in rather limited demonstrations. Now, researchers have used carbon nanotubes to make a general purpose, RISC-V-compliant processor that handles 32-bit instructions and does 16-bit memory addressing. Performance is nothing to write home about, but the processor successfully executed a variation of the traditional programming demo, “Hello world!” It’s an impressive bit of work, but not all of the researchers’ solutions are likely to lead to high-performance processors. The rate of progress on this particular technology is astounding.
We recently restored an Apollo Guidance Computer, the revolutionary computer that helped navigate to the Moon and land on its surface. At a time when most computers filled rooms, the Apollo Guidance Computer (AGC) took up just a cubic foot. This blog post discusses the small but complex switching power supplies that helped make the AGC compact enough to fit onboard the spacecraft. The Apollo project is one of the greatest scientific and engineering achievements in human history, and apparently that goes down to the details. Amazing.
Standard Telephone & Cable made quite a few phones for British Telecom in the 70s/80s that most people will recognise instantly, even though they never knew who made them. Probably, like me, they thought that BT made all its own stuff, which I later found out was completely wrong, but hey. In the early 80s they branched out into computerised telephones with this lovely looking beast, the Executel 3910. Fellow collector Tony brought this one to my attention and on seeing the pictures I said ‘what the hell is THAT!’ and bought it. It’s a desk phone, pure and simple, but massively computerised with an AMD 8085 processor and 32K RAM plus a 5″ monitor for displaying diary and phonebook entries AND, and it’s a big AND, PRESTEL access! A recent video by Techmoan – who bought a working model – brought this device to my attention, and I instantly fell in love with it. This is an incredible piece of engineering and forward-thinking.
To many, the (UEFI-based) boot process is like voodoo: something that most of us rely on extensively but that is – in a technical-understanding sense – generally avoided by all but those who work in this space. In this article, I hope to present a technical overview of how modern PCs boot using UEFI (Unified Extensible Firmware Interface). I won’t be mentioning every detail – honestly, my knowledge in this space isn’t fully comprehensive (hence the impetus for this article-as-a-primer). A rather detailed overview of the UEFI boot process.
Then they heard about a working model of the ELEA 9003, Olivetti’s first commercial mainframe, introduced in 1959. They lost no time tracking it down. This 9003 had originally belonged to a bank in Siena, where it was used for payroll, managing accounts, calculating interest rates, and the like. In 1972, the bank donated the computer to a high school in the Tuscan hill town of Bibbiena. And there it’s been ever since. Today, former Olivetti employees periodically travel to the ISIS High School Enrico Fermi to tend to the machine. A unique piece of computing history that must be saved at all costs.
I had spent some time several years ago trying to get Linux running on this machine via the (defunct) JLime project, so I had some of the pieces available to actually get this little “pocket computer” going again – mainly compatible CompactFlash cards and an external card reader. But I was mostly joking. Then I started thinking how funny it would be to actually sit in a talk and take notes at DEF CON on an ancient “laptop”… These things are a thing of beauty.
Long obsolete but not just a museum piece, an early massive computer developed 60 years ago remains working, thanks to a technician dedicated to preserving it for future generations. Tadao Hamada believes that keeping the historic FACOM128B operational will help hand down Japan’s technological heritage to posterity. “I will maintain it forever,” said Hamada, 49. The importance of the work done by people like Tadao Hamada cannot be overstated. A lot of technology from the ’40s, ’50s, and ’60s is getting ever more obscure, and as their original designers, maintainers, and users die of old age, we need some way to document their knowledge and pass it on so that we can preserve the technology for posterity. Hamada went one step further, and actually had to teach himself how the system and its operating system worked, since there was nobody around to teach him. That’s some serious dedication, and I applaud both him and Fujitsu, which set up the project to preserve the technology.
When my brother’s old 1980s 5″ black and white TV was recently discovered during a “I wonder what’s under here?” exercise and amazingly seemed to still be working my first thought was, of course, “Nice!! 3rd monitor for my PC”. I knew that wouldn’t be exactly simple as the TV only appeared to have a 3.5mm “EXT. ANT” socket. …I can’t do anything but applaud this.
Classic USB, across the 1.1, 2.0, and 3.0 generations using USB-A and USB-B connectors, had a really nice property: cables were directional, and plugs and receptacles were physically distinct for each capability level. A USB 3.0-capable USB-B plug was physically larger than a 2.0 plug and would not fit into a USB 2.0-only receptacle. For the end user, this meant that as long as they had a cable that would physically connect to both the host and the device, the system would function properly, as there is only ever one kind of cable that goes from one A plug to a particular flavor of B plug. Does the same hold for USB-C? We all know the answer to this mess.
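The classic-USB guarantee described above can be sketched as a lookup: for any (host plug, device plug) pairing there is exactly one cable type, so “it physically fits” implies “it works at a known speed”. Here is a toy model of that property; the connector and cable names are illustrative labels of mine, not terms from any USB specification:

```python
# Toy model of the classic-USB property: each (A-plug, B-plug) pairing
# maps to exactly one cable type, so "it fits" implies "it works".
# Names are illustrative, not taken from the USB specification.

CABLES = {
    ("USB-A 2.0", "USB-B 2.0"): "USB 2.0 cable (480 Mbps max)",
    ("USB-A 3.0", "USB-B 3.0"): "USB 3.0 cable (5 Gbps max)",
    ("USB-A 2.0", "Micro-B 2.0"): "USB 2.0 micro cable",
    ("USB-A 3.0", "Micro-B 3.0"): "USB 3.0 micro cable",
}

def link_capability(host_plug, device_plug):
    """Return the link capability, or None if the plugs cannot pair."""
    return CABLES.get((host_plug, device_plug))

# A USB 3.0 B plug physically won't enter a 2.0-only receptacle,
# so an impossible pairing simply has no entry in the table.
assert link_capability("USB-A 3.0", "USB-B 2.0") is None
assert link_capability("USB-A 3.0", "USB-B 3.0") == "USB 3.0 cable (5 Gbps max)"
```

With USB-C, the same lookup would need extra dimensions (cable e-marker, power rating, alternate modes) that are invisible from the connector’s shape, which is exactly the mess the article gets into.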
The RISC-V Foundation, a non-profit corporation controlled by its members to drive the adoption and implementation of the free and open RISC-V instruction set architecture (ISA), today announced the ratification of the RISC-V base ISA and privileged architecture specifications. The RISC-V base architecture is the interface between application software and hardware. Software that’s coded to this specification will continue to work on RISC-V processors in perpetuity, even as the architecture evolves through the development of new extensions.
The Video Electronics Standards Association today announced that it has released version 2.0 of the DisplayPort audio/video standard. DP 2.0 is the first major update to the DisplayPort standard since March 2016, and provides up to a 3X increase in data bandwidth performance compared to the previous version of DisplayPort (DP 1.4a), as well as new capabilities to address the future performance requirements of traditional displays. These include beyond 8K resolutions, higher refresh rates and high dynamic range (HDR) support at higher resolutions, improved support for multiple display configurations, as well as improved user experience with augmented/virtual reality (AR/VR) displays, including support for 4K-and-beyond VR resolutions. The fact that standards like HDMI and DisplayPort hide version numbers behind the same physical plug has always bothered me. It’s often unclear which version a given device actually supports, which can lead to some unfortunate surprises. I wish there were an easier way to figure this sort of stuff out.
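The version number matters because it decides what the link can actually drive. As a rough back-of-the-envelope check, assuming the commonly cited effective payload rates (about 25.92 Gbps for DP 1.4a at HBR3 and about 77.37 Gbps for DP 2.0 at UHBR20, both over four lanes) and a crude blanking-overhead factor; treat all of these numbers as approximations, not spec-exact figures:

```python
# Rough check of whether an uncompressed display mode fits a DisplayPort link.
# Payload rates are approximate commonly cited figures, not spec-exact:
# DP 1.4a (HBR3, 4 lanes) ~25.92 Gbps, DP 2.0 (UHBR20, 4 lanes) ~77.37 Gbps.

LINK_GBPS = {"DP 1.4a": 25.92, "DP 2.0": 77.37}

def mode_gbps(width, height, refresh_hz, bits_per_pixel, blanking=1.2):
    """Uncompressed bandwidth in Gbps, with a crude blanking-overhead factor."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

def fits(link, width, height, refresh_hz, bits_per_pixel):
    return mode_gbps(width, height, refresh_hz, bits_per_pixel) <= LINK_GBPS[link]

# 8K (7680x4320) at 60 Hz with 30-bit color needs roughly 72 Gbps here:
assert not fits("DP 1.4a", 7680, 4320, 60, 30)  # beyond DP 1.4a uncompressed
assert fits("DP 2.0", 7680, 4320, 60, 30)       # within DP 2.0's bandwidth
```

Same plug, wildly different ceilings, which is exactly why the invisible version number is so frustrating.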
We have a surprise for you today: Raspberry Pi 4 is now on sale, starting at $35. This is a comprehensive upgrade, touching almost every element of the platform. For the first time we provide a PC-like level of performance for most users, while retaining the interfacing capabilities and hackability of the classic Raspberry Pi line. The specification bump is quite something, and the pricing is as good as it’s always been. This is a no-brainer buy for me.
It’s a hardware day today, and since AnandTech is the most authoritative source on stuff like this, we’ve got more from them. Arm announced its next big micro-architecture – which will find its way to flagship smartphones soon. Overall the Cortex-A77 announcement today isn’t quite as big a change as what we saw last year with the A76, nor is it as big a change as today’s new announcement of Arm’s new Valhall GPU architecture and G77 GPU IP. However, what Arm managed to achieve with the A77 is a continued execution of their roadmap, which is extremely important in the competitive landscape. The A76 delivered on all of Arm’s promises and ended up being an extremely performant core, all while remaining astonishingly efficient as well as having a clear density lead over the competition. In this regard, Arm’s major clients are still heavily focused on having the best PPA in their products, and Arm delivers. The one big surprise about the A77 is that its floating point performance boost of 30-35% is quite a lot higher than I had expected of the core, and in the mobile space, web browsing is the killer app that happens to be floating point heavy, so I’m looking forward to seeing how future SoCs with the A77 will perform. As linked above, the company also announced its next-generation mobile GPU architecture.
In 2017, we saw several new MCUs hit the market, as well as general trends continuing in the industry: the migration to open-source, cross-platform development environments and toolchains; new code-generator tools that integrate seamlessly (or not so seamlessly…) into IDEs; and, most notably, the continued invasion of ARM Cortex-M0+ parts into the 8-bit space. I wanted to take a quick pulse of the industry to see where everything is — and what I’ve been missing while backed into my corner of DigiKey’s web site. It’s time for a good ol’ microcontroller shoot-out.
Based on technology developed by Hewlett-Packard, Microsoft’s IntelliMouse Explorer arrived with a price tag that could be justified by even cash-strapped students like me. Even better, the underside of the mouse was completely sealed, preventing even the tiniest speck of dirt from penetrating its insides, and it improved on its predecessors by working on almost any surface that wasn’t too reflective. I remember getting back to my dorm room and plugging in the Explorer for the first time, wondering who had a rig fancy enough to use the included PS/2-to-USB adapter. There were undoubtedly a few driver installation hiccups along the way, but once Windows 98 was happy, I fired up Photoshop and strapped in for the smoothest mouse experience I’d ever had. Problem solved. The changeover from ball mice to optical mice is something few will ever rave about, but I remember it as one of the biggest changes in computer use I’ve personally ever experienced. Everything about optical mice is better than ball mice, and using an optical mouse for the first time roughly two decades ago was a complete game-changer.
When OSNews covered the RISC V architecture recently, I was struck by my own lack of excitement. I looked into it, and the project looks intriguing, but it didn’t move me on an emotional level like a new CPU architecture development would have done many years ago. I think it’s due to a change in myself, as I have got older. When I first got into computers, in the early 80s, there was a vibrant environment of competing designs with different approaches. This tended to foster an interest, in the enthusiast, in what was inside the box, including the CPU architecture. Jump forwards to the current era, and the computer market is largely homogenized to a single approach for each class of computing device, and this means that there is less to get excited about in terms of CPU architectures in general. I want to look at what brought about this change in myself, and maybe these thoughts will resonate with some of you.