The 76477 Complex Sound Generation chip (1978) provided sound effects for Space Invaders and many other video games. It was also a popular hobbyist chip, easy to experiment with and available at Radio Shack. I reverse-engineered the chip from die photos and found some interesting digital circuitry inside. Perhaps the most interesting is a shift register based white noise generator, useful for drums, gunshots, explosions and other similar sound effects. The chip also uses a digital mixer to combine the chip's different sound generators. An unusual feature of the chip is that it uses Integrated Injection Logic (I2L), a type of digital logic developed in the 1970s with the goal of high-density, high-speed chips.
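The shift-register noise generator mentioned above is a classic linear-feedback shift register (LFSR). As a rough sketch of the idea — with a generic maximal-length polynomial (x^15 + x^14 + 1), not necessarily the 76477's actual width or tap positions — it can be modeled in a few lines of Python:

```python
def lfsr_noise(steps, bits=15, taps=(14, 13)):
    """Pseudo-random bit stream from a linear-feedback shift register.

    Illustrative only: the register width and tap positions are a generic
    maximal-length choice, not necessarily the 76477's actual configuration.
    """
    state = 1  # any nonzero seed works
    out = []
    for _ in range(steps):
        # The feedback bit is the XOR of the two tapped register bits.
        fb = ((state >> taps[0]) ^ (state >> taps[1])) & 1
        # Shift left, inject the feedback bit, mask to the register width.
        state = ((state << 1) | fb) & ((1 << bits) - 1)
        out.append(state & 1)  # the low bit drives the "noise" output
    return out

print(lfsr_noise(20))
```

With well-chosen taps the register cycles through every nonzero state before repeating, so the output bit stream sounds like white noise when clocked at audio rates — which is why the same trick shows up in so many sound chips of the era.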
I've been trying to make my computers quieter for nearly three decades. Custom liquid cooling loops, magnetically-stabilised fluid-dynamic bearings, acoustic dampeners, silicone shock absorbers, you name it. Well, last week I finally managed to build a completely silent computer. Without further ado...
The Streacom DB4 is an amazing chassis and case, which I am considering for one of my next computer builds. This article provides great insight into building such a fanless PC, with links to additional articles about the system later in its lifespan.
So the objective here was to take a C64 breadbin case and keyboard and put a Raspberry Pi 3 into it, keeping the keyboard and joystick ports working while also gaining HDMI, USB controller support, and modem emulation. While I still have 2 real Commodore 64s (and an Ultimate64 on the way!), I like using the RPi and Vice to play 64 games.
These mounts do not require you to drill or cut your C64 case! The 3D files are released under a Creative Commons license, so they are free to use, distribute, modify, or even sell.
Just a fun project.
This build is a 10cm x 10cm x 10cm replica of the NeXT Computer to house a Raspberry Pi computer. I designed and built this specifically with the aim of having it run some basic server tasks on my home network, such as storing revision control repositories etc.
The necessary files to make your own are available. What a neat-looking case - I'd love a similar one, but slightly larger so it can house a mini-ITX board. I would love to build a Ryzen II machine in a case like this.
But a laptop is more than just a video playback machine. For myself and millions of others, it's the primary tool for earning a living. We use these machines to read, write, remember, create, connect, and communicate. And in most of these other applications, a 16:9 screen of 13 to 15 inches in size just feels like a poor fit.
As long as I can easily open more than one document side by side, any aspect ratio gets my blessing. I don't mind black bars on video, especially since today's screens have pretty good black levels, so they're hardly distracting. Still, I'm glad more and more laptop makers are starting to see the benefit in 3:2-like displays.
It was supposed to be the laptop that saved the world.
In late 2005, tech visionary and MIT Media Lab founder Nicholas Negroponte pulled the cloth cover off a small green computer with a bright yellow crank. The device was the first working prototype for Negroponte's new nonprofit One Laptop Per Child, dubbed "the green machine" or simply "the $100 laptop". And it was like nothing that Negroponte's audience - at either his panel at a UN-sponsored tech summit in Tunis, or around the globe - had ever seen.
The OLPC was all the rage and hype for a few years back then, but it never lived up to its promise. Still, while not nearly the same thing, cheap mobile phones and smartphones have come to play a somewhat similar role.
Cloudflare, which operates a content delivery network it also uses to provide DDoS protection services for websites, is in the middle of a push to vastly expand its global data center network. CDNs are usually made up of small-footprint nodes, but those nodes need to be in many places around the world.
As it expands, the company is making a big bet on ARM, the emerging alternative to Intel’s x86 processor architecture, which has dominated the data center market for decades.
The money quote from Cloudflare's CEO:
"We think we're now at a point where we can go one hundred percent to ARM. In our analysis, we found that even if Intel gave us the chips for free, it would still make sense to switch to ARM, because the power efficiency is so much better."
The HP 9000 series of computers spanned almost three decades and a very diverse range of Unix platforms. Both RISC and Unix (the latter with a longer history) were developed into coherent products during the 1980s, moving from academia via industrial R&D to productization at a time when much computing was still done on mainframes, minicomputers, and time-sharing machines such as the DEC PDP and VAX, the IBM AS/400, and the System/360.
Paul Weissmann tells the story of the development and history of the HP9000.
I stumbled upon an absolute gem of a website over the weekend - Sophie Haskins' Pizza Box Computer. On this site, Haskins details a number of ancient non-x86 workstations. All of the posts on the site are fun and interesting reads, so let's pick one of her machines - a DEC Multia running Windows NT for Alpha -
The Multia was an attempt by Digital to make a lower-cost Alpha workstation for running Windows NT. There were Alpha and Intel Pentium models, and they used a lot of off-the-shelf PC components rather than custom Digital ones (hence its later name, the "Universal Desktop Box"). It's quite tiny - so much so that it has laptop PCMCIA slots for expansion!
The latest post details getting Windows NT up and running on the Multia, and is certainly worth a read - like the rest of the site.
If there's one thing that will make even the most powerful computer feel like a 7-year-old rig, it's Adobe Lightroom paired with RAW files from any high-megapixel camera.
In my case, I spent over a year of spare time editing 848GB worth of 11,000+ 42-megapixel RAW photos and 4K videos from my New Zealand trip and making these nine photosets. I quickly realized that my two year old iMac was not up to the challenge.
In 2015 I took a stab at solving my photo storage problem with a cloud-backed 12TB Synology NAS. That setup is still running great. Now I just need to keep up with the performance requirements of having the latest camera gear with absurd file sizes.
I decided it was time to upgrade to something a bit more powerful. This time I decided to build a PC and switch to Windows 10 for my heavy computing tasks. Yes, I switched to Windows.
I love articles like this, because there is no one true way to build a computer for any task, and everyone has their own opinions, ideas, and preferences, so that no two self-built PCs are ever quite the same. Add in a healthy dose of urban legends and tradition, and you have a great cocktail for endless discussions that never go anywhere.
It's clickbait without actually being clickbait.
The disclosure of the Meltdown and Spectre vulnerabilities has brought a new level of attention to the security bugs that can lurk at the hardware level. Massive amounts of work have gone into improving the (still poor) security of our software, but all of that is in vain if the hardware gives away the game. The CPUs that we run in our systems are highly proprietary and have been shown to contain unpleasant surprises (the Intel management engine, for example). It is thus natural to wonder whether it is time to make a move to open-source hardware, much like we have done with our software. Such a move may well be possible, and it would certainly offer some benefits, but it would be no panacea.
Given the complexity of modern CPUs and the fierceness of the market in which they are sold, it might be surprising to think that they could be developed in an open manner. But there are serious initiatives working in this area; the idea of an open CPU design is not pure fantasy. A quick look around turns up several efforts; the following list is necessarily incomplete.
Today marks a major milestone in the processor industry - we've launched Qualcomm Centriq 2400, the world's first and only 10nm server processor. While this is the culmination of an intensive five-year journey for the Qualcomm Datacenter Technologies (QDT) team, it also marks the beginning of an era that will see a step function in the economics and energy efficiency of operating a datacenter.
The Intel Management Engine (ME), which is a separate processor and operating system running outside of user control on most x86 systems, has long been of concern to users who are security and privacy conscious. Google and others have been working on ways to eliminate as much of that functionality as possible (while still being able to boot and run the system). Ronald Minnich from Google came to Prague to talk about those efforts at the 2017 Embedded Linux Conference Europe.
The Xerox Alto, widely recognized as the first modern personal computer, pioneered just about every basic concept we are familiar with in computers today. These include windows, bit-mapped computer displays, the whole idea of WYSIWYG interfaces, the cut/paste/copy tools in word processing programs, and pop-up menus. Most of this vision of the "office of the future" was first unveiled at a meeting of Xerox executives held on 10 Nov 1977, which was 40 years ago last week.
To celebrate that birthday, the Computer History Museum in Mountain View, Calif., brought together some of the PARC researchers who worked on the Alto on Friday. They put it through its paces in a series of live demos. These demos used an Alto that had been restored to working order over the past eight months.
One of the most important computers ever made.
There really is no rational reason to restore a late 90s NEC-manufactured Packard Bell computer. Which is exactly why I'm doing it. Join me in getting this unloved machine back to factory fresh condition!
LGR is one of the best and most entertaining technology channels on YouTube, and his latest video from today hits home particularly hard, since these kinds of crappy, low-budget late '90s PCs defined my early teens. Nobody in my family, town, or school had Macs or other types of computers - it was all PC, as cheap as possible, fully embracing the race to the bottom which for many people still defines the PC today.
It's good to see that there are people willing to preserve these otherwise forgettable machines for posterity. They may objectively suck, but they did make computing accessible to an incredibly wide audience, and they served an important role in the history of computing.
As an embedded design consultant, the diverse collection of projects on my desk needs an equally diverse collection of microcontroller architectures that have the performance, peripheral selection, and power numbers to be the backbone of successful projects. At the same time, we all have our go-to chips - those parts that linger in our toolkit after being picked up in school, through forum posts, or from previous projects.
In 2017, we saw several new MCUs hit the market, as well as general trends continuing in the industry: the migration to open-source, cross-platform development environments and toolchains; new code-generator tools that integrate seamlessly (or not so seamlessly...) into IDEs; and, most notably, the continued invasion of ARM Cortex-M0+ parts into the 8-bit space.
I wanted to take a quick pulse of the industry to see where everything is - and what I've been missing while backed into my corner of DigiKey’s web site.
An amazingly detailed and well-organised resource.
We succeeded in running the Smalltalk-76 language on our vintage Xerox Alto; this blog post gives a quick overview of the Smalltalk environment. One unusual feature of Smalltalk is that you can view and modify the system's code while the system is running. I demonstrate this by modifying the scrollbar code on a running system.
Smalltalk is a highly influential programming language and environment that introduced the term "object-oriented programming" and was the ancestor of modern object-oriented languages. The Alto's Smalltalk environment is also notable for its creation of the graphical user interface with the desktop metaphor, icons, scrollbars, overlapping windows, popup menus and so forth. When Steve Jobs famously visited Xerox PARC, the Smalltalk GUI inspired his ideas of how the Lisa and Macintosh should work.
Be sure to read the comments after the article itself, since they include clarifications from none other than Alan Kay himself.
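The live modification demonstrated in the article has a loose analogue in modern dynamic languages. This Python sketch (class and method names invented for illustration, nothing like the actual Smalltalk scrollbar code) shows a method being redefined on a "running system", with existing objects picking up the change immediately:

```python
class ScrollBar:
    """A stand-in class, purely for illustration."""
    def width(self):
        return 16

bar = ScrollBar()
print(bar.width())  # 16

# Redefine the method on the live class. Instances created before the
# change see the new behaviour at once - loosely analogous to editing
# code in a running Smalltalk image.
ScrollBar.width = lambda self: 24
print(bar.width())  # 24
```

Smalltalk goes much further, of course: the entire environment, browser and all, is built from objects you can inspect and edit in place, which is exactly what the scrollbar demo shows.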
A couple years ago, Lenovo announced its plans to build a "retro" ThinkPad that would resurrect design elements of ThinkPads past as an homage to the brand's long history.
That ThinkPad is now real. Check out the ThinkPad 25, sold to commemorate 25 years of ThinkPads.
I'm just going to leave this here for you lovely ThinkPad people. This isn't for me, but I'm not here to ruin your party.
Do clean up after yourselves.
If you're a demanding computer user, sometimes your 13-inch Ultrabook laptop just won't quite cut it. For those looking for a little more computing power, HP's new Z8 workstation could be just the answer. The latest iteration of HP's desktop workstations packs in a pair of Intel Skylake-SP processors, topping out with twinned Xeon Platinum 8180 chips: 28 cores/56 threads and 38.5MB cache each running at 2.5-3.8GHz, along with support for up to 1.5TB RAM.
Next year, you'll be able to go higher still with the 8180M processors; same core count and speeds, but doubling the total memory capacity to 3TB, as long as you want to fill the machine's 24 RAM slots.
Those processors and memory can be combined with up to three Nvidia Quadro P6000 GPUs or AMD Radeon Pro WX 9100 parts if you prefer that team. The hefty desktop systems have four internal drive bays, two external (and a third external for an optical drive), and nine PCIe slots. Storage options include up to 4TB of PCIe-mounted SSD, and 48TB of spinning disks. A range of gigabit and 10 gigabit Ethernet adaptors are available; the machines also support 802.11a/b/g/n/ac Wi-Fi and Bluetooth 4.2. Thunderbolt 3 is available with an add-in card.
This is one hell of a beast of a machine, and something most of us will never have the pleasure to use. That being said - I've always been fascinated by these professional workstations, and the HP ones in particular. Current models are obviously way out of my price range, but older models - such as a model from the Z800 range - are more attainable.
This paper is a gentle but rigorous introduction to quantum computing intended for computer scientists. Starting from a small set of assumptions on the behavior of quantum computing devices, we analyze their main characteristics, stressing the differences with classical computers, and finally describe two well-known algorithms (Simon's algorithm and Grover's algorithm) using the formalism developed in previous sections. This paper does not touch on the physics of the devices, and therefore does not require any notion of quantum mechanics.
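Grover's algorithm, one of the two the paper works through, can be simulated classically for tiny search spaces. This is a minimal sketch of the amplitude bookkeeping, not the paper's own formalism; the function and variable names are mine:

```python
import math

def grover_probabilities(n_items, marked, iterations):
    """Classically simulate Grover's search over n_items entries.

    Tracks the real-valued amplitudes directly; an actual quantum
    computer performs these reflections on a superposition.
    """
    # Start in the uniform superposition over all items.
    state = [1 / math.sqrt(n_items)] * n_items
    for _ in range(iterations):
        # Oracle: flip the sign of the marked item's amplitude.
        state[marked] = -state[marked]
        # Diffusion operator: reflect every amplitude about the mean.
        mean = sum(state) / n_items
        state = [2 * mean - a for a in state]
    # Measurement probabilities are the squared amplitudes.
    return [a * a for a in state]

# For N = 4, a single iteration finds the marked item with certainty.
print(grover_probabilities(4, marked=2, iterations=1))
```

The sign flip plus reflection about the mean boosts the marked item's amplitude each round, which is why Grover's search needs only on the order of the square root of N queries instead of N.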
Some light reading before bedtime.