Without question, 2018 was the year RISC-V genuinely began to build momentum among chip architects hungry for open-source instruction sets. That was then.
By 2019, RISC-V won't be the only game in town.
Wave Computing announced Monday that it is open-sourcing MIPS, making the MIPS Instruction Set Architecture (ISA) and the latest MIPS R6 release available in the first quarter of 2019.
Good news, and it makes me wonder - will we ever see a time when x86 and x86-64 are open source? I am definitely not well-versed enough in these matters to judge just how important the closed-source nature of the x86 ISA really is to Intel and AMD, but it seems like something that will never happen.
It's pretty simple to archive Commodore 64 tapes, but it's hard if you want to do it right. Creating the complete archive of the German "INPUT 64" magazine was not as easy as getting one copy of each of the 32 tapes and reading them. The tapes are over 30 years old by now, and many of them are hardly readable any more.
Due to their size and lack of portability, 17-inch notebooks are not exactly popular among road warriors. Instead this is largely the domain of desktop replacement-class machines, which in turn has caused 17-inch laptops to be built bigger still in order to maximize their performance and emphasize the replacement aspect. Every now and then however we see a 17-inch laptop that still tries to be reasonably portable, and this is the case with LG's latest gram laptop, which hit the market this week.
Equipped with a 17.3-inch screen featuring a 2560×1600 resolution, the LG gram 17 comes in a dark silver Carbon Magnesium alloy chassis that is only 17.8 mm (0.7 inches) thick, which is thinner than most 15-inch notebooks (in fact, it's even thinner than the ASUS ZenBook Pro 15). Meanwhile, the laptop weighs 1.33 kilograms (2.95 pounds), which is in line with many 13-inch mobile PCs. As a result, while the 17-inch gram still has a relatively large footprint, it's still a fairly portable laptop.
I'm genuinely surprised LG decided to put this 17-incher on the market - consider it a sort of spiritual successor to the 17" PowerBook G4, in my view one of the best laptops ever made. It seems like the market has pretty much settled on 12"-13", with a few professional and low-end laptops offering a 15" screen. I hope this LG laptop is at least a modest success, because I'd love for more 17" laptops to make it to market.
Today is the second day of Qualcomm's Snapdragon Technology Summit in Maui, and while yesterday was all about 5G and a teaser for its new chipset, today is all about the Snapdragon 855. The new chipset is built on a 7nm process, promising faster speeds, better battery life, and improved connectivity.
But as far as general performance goes, Qualcomm says that its Kryo 485 cores will offer a 45% boost, and the Adreno 640 GPU will show a 20% increase. With the firm's Snapdragon Elite Gaming Platform, gamers will be able to play in HDR with physically based rendering (PBR).
If these numbers hold up - only independent benchmarking will tell - they will go some way toward closing the wide gap with Apple's current offering.
Today we are launching EC2 instances powered by Arm-based AWS Graviton Processors. Built around Arm cores and making extensive use of custom-built silicon, the A1 instances are optimized for performance and cost. They are a great fit for scale-out workloads where you can share the load across a group of smaller instances. This includes containerized microservices, web servers, development environments, and caching fleets.
Interesting to see Amazon design its own Arm-based processor specifically for its own platform.
Among obscure pop culture tidbits and stories about wacky inventions, Tedium has often documented the continued survival of technology long thought of as obsolete. From calculagraphs to COBOL, we love hearing that ancient tech survives in the 21st century and revel in the uses that keep them around. So it was surprising to dig through the Tedium archives looking for something I expected to find, but didn't. Today, we're righting that wrong and diving into the robust and thriving world of a technology that was foundational to the progress humanity made during the 20th century. Today's Tedium is talking vacuum tubes.
Everybody has one. At least one. Collecting dust in a closet somewhere; waiting to be thrown away. It's not a time capsule per se, but if you looked at it now it would probably show you a snapshot of a life you lived not that long ago. It was once a source of pride, entertainment, accomplishment or perhaps comfort. Maybe it was a status symbol. Now you would call it useless, worthless, junk.
We're not talking about the photo album from your dorm-room party days, although it might still contain a copy. We're talking about your old PC, laptop, or netbook. That thing you spent hundreds or thousands of dollars on to sit in front of for hours doing whatever it is that you do. Maybe it helped you get a degree, or maybe it was your primary source of income. Doesn't matter now anyway. Your smart toaster does more MIPS and FLOPS with half the power! There's no value in an old computer, right?
Wrong! If the commoditization of computing hardware and the steady march of Moore's law have done anything to old computers, it has been to breathe new life into them. How, you ask?
Putting old hardware to new uses is one way of recycling - I tend to give away my "old" smartphones because I buy new ones way too often. Often, a friend's phone stops working or a family member needs a new one - so I just give them mine.
From the Byte Cellar:
What inspired me to pull the Model 4 down off the shelf were a number of tweets from telnet BBS pals showing the system being put to great use logged into various systems across the web. Some of the screenshots showed the machine rendering ANSI "graphics" onscreen and I looked into it. As I suspected, the stock Model 4 is not capable of taking on a custom character set such as is needed by ANSI emulation, and I discovered the system had been equipped with a graphics board and the ANSI-supporting terminal program, ANSITerm, was rendering "text" to a graphics display; the character set was basically a software font.
And I just had to go there.
Microsoft is currently running an interesting set of hardware experiments. The company is taking a souped-up shipping container stuffed full of computer servers and submerging it in the ocean. The most recent round is taking place near Scotland's Orkney Islands, and involves a total of 864 standard Microsoft data-center servers. Many people have impugned the rationality of the company that put Seattle on the high-tech map, but seriously - why is Microsoft doing this?
There are several reasons, but one of the most important is that it is far cheaper to keep computer servers cool when they're on the seafloor. This cooling is not a trivial expense. Precise estimates vary, but currently about 5 percent of all energy consumption in the U.S. goes just to running computers - a huge cost to the economy as a whole. Moreover, all that energy used by those computers ultimately gets converted into heat. This results in a second cost: that of keeping the computers from melting.
I use a custom watercooling loop to keep my processor and videocard cool, but aside from size and scale, datacenters struggle with the exact same problem - computers generate a ton of heat, and that heat needs to go somewhere.
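To get a feel for the scale of the problem, here's a back-of-the-envelope sketch for a deployment the size of Microsoft's 864-server pod. The per-server draw and the PUE (Power Usage Effectiveness, the ratio of total facility power to IT power) are my own illustrative assumptions, not Microsoft's published figures:

```python
# Rough energy estimate for an 864-server deployment.
# watts_per_server and pue are assumed values for illustration only.
servers = 864
watts_per_server = 250           # assumed average draw per server
pue = 1.4                        # assumed facility power / IT power
hours_per_year = 24 * 365

it_kw = servers * watts_per_server / 1000   # raw IT load in kW
facility_kw = it_kw * pue                   # including cooling overhead
annual_kwh = facility_kw * hours_per_year

print(f"IT load: {it_kw:.0f} kW")                   # 216 kW
print(f"With cooling overhead: {facility_kw:.0f} kW")
print(f"Annual energy: {annual_kwh:,.0f} kWh")      # roughly 2.6 million kWh
```

Every watt the servers draw ends up as heat that has to be removed, which is why the PUE multiplier matters so much: dropping cooling costs drops the overhead factor directly, and free seawater cooling is about as cheap as it gets.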
The Spectrum was not the first Sinclair computer to make it big. It was, however, the first to go massive. In the months prior to launching, 'The Computer Programme' had aired on the BBC, legitimising the home micro computer as the must-have educational item of the 1980s. For Sinclair and the ZX Spectrum the time was right, parents were keen, and the kids were excited. Games would soon be everywhere thanks to all the kids programming their brand-new Spectrums.
A major success factor, the one that gave the Spectrum its name, is the computer's capacity to generate a spectrum of colours. The micro is capable of generating 16 colours: 8 low-intensity colours and 8 matching bright variants. It's hard to imagine now, but in 1982 these 16 colours were enough to start a home computer revolution. Richard Altwasser, the engineer employed by Sinclair to develop the Spectrum's graphics systems, was setting a new benchmark with some very innovative ideas.
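Those 16 colours are really a 3-bit colour index plus a BRIGHT flag, packed into the Spectrum's per-cell attribute byte (bit 7 = FLASH, bit 6 = BRIGHT, bits 5-3 = paper colour, bits 2-0 = ink colour). A small sketch of decoding one:

```python
# Decode a ZX Spectrum attribute byte into its colour components.
# Colour order 0-7 is the Spectrum's standard palette ordering.
NAMES = ["black", "blue", "red", "magenta",
         "green", "cyan", "yellow", "white"]

def decode_attr(attr: int) -> dict:
    flash = bool(attr & 0x80)        # bit 7: alternate ink/paper
    bright = bool(attr & 0x40)       # bit 6: bright variant of both colours
    paper = (attr >> 3) & 0x07       # bits 5-3: background colour
    ink = attr & 0x07                # bits 2-0: foreground colour
    prefix = "bright " if bright else ""
    return {"flash": flash,
            "paper": prefix + NAMES[paper],
            "ink": prefix + NAMES[ink]}

# 0x47 = BRIGHT set, black paper, white ink
print(decode_attr(0x47))
# → {'flash': False, 'paper': 'bright black', 'ink': 'bright white'}
```

Note that BRIGHT applies to the whole 8x8 cell, not to ink and paper independently - one of the trade-offs that produced the Spectrum's famous "attribute clash".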
I missed the entire 8-bit home micro revolution - I was simply too young, or not even born yet. It must've been such an exciting time.
This site hosts the current version of the retro-B5500 emulator, an implementation of the legendary Burroughs B5500 that runs in a web browser.
The Wikipedia page has more information on the unique Burroughs B5500 line of machines from the 1960s.
It's almost the end of 2018, but I'm finally able to say that almost all of my day-to-day devices have been replaced with a USB-C option, or can be replaced in the near future.
I bought a fully specced out Dell XPS 13, and it's the first laptop I've ever had that charges over USB-C. Cool and all, but I quickly realized that only the 27W charger it came with actually charges it; other USB-C chargers simply don't work because they're not powerful enough.
I'm not quite sure USB-C is there, yet.
A Z80 computer wirewrapped on perfboard. The wirewrapping technique uses standard IC sockets and PCB header pins, so the components and wiring are on the same side of the board.
This is such cool engineering. I wish I had more time and base knowledge to dive into making things like this myself. I absolutely love building LEGO sets, and this feels like very, very advanced LEGO.
Studies show that the amount of data being recorded is increasing at 30 to 40 percent per year. At the same time, the capacity of modern hard drives, which are used to store most of this, is increasing at less than half that rate. Fortunately, much of this information doesn’t need to be accessed instantly. And for such things, magnetic tape is the perfect solution.
Seriously? Tape? The very idea may evoke images of reels rotating fitfully next to a bulky mainframe in an old movie like Desk Set or Dr. Strangelove. So, a quick reality check: tape has never gone away!
Developed in a magic night on 19 Aug 2018, between 2am and 8am, darkriscv is a very experimental implementation of the open-source RISC-V instruction set. Nowadays, after one week of exciting sleepless nights of work (which explains the many typos you will find ahead), darkriscv has reached a very good quality result, to the point that a "hello world" compiled by the standard riscv-elf-gcc is working fine!
I feel incompetent.
Acer's leading gaming brand, Predator, is all about maximizing performance. In the modern era, that now extends into content creation, streaming, video editing, and all the sorts of things that drive the need for high performance. As we've seen several times over the years, just throwing more cores at the problem isn't the solution: bottlenecks appear elsewhere in the system. Despite this, Acer is preparing a mind-boggling solution.
The Acer Predator X is the new dual-Xeon workstation, with ECC memory and multiple graphics cards, announced today at IFA 2018. The system is aimed at the multi-taskers who do everything: gaming, content creation, streaming, the lot. With this being one of Acer's flagship products, we expect it to be geared to the hilt: maximum cores, maximum capacity. Therein lies the first rub: if Acer is going all out, this is going to cost something crazy.
This clearly makes zero sense, but at the same time, it's kind of awesome Acer is doing this. Dual-processor workstations are a bit of an obsession for me, but with dual-processor machines entirely relegated to Xeon systems, they've become quite unaffordable. Even though it makes zero sense, I would love for regular Intel Core and AMD Zen processors to support dual processor setups.
In essence, it's a 32-bit RISC ISA designed from a holistic view of integer, floating point, scalar and vector operations. In addition, there is a hardware implementation of a single-issue, in-order, pipelined CPU. The hardware implementation mostly serves as an aid in the design of the ISA (at the time of writing, the CPU is still incomplete).
As happens with some articles I post here, this one's definitely a bit over my head.
Recently, Intel bought Altera, one of the largest producers of FPGAs. Intel paid a whopping $16.7 billion, making it their largest acquisition ever. In other news, Microsoft is using FPGAs in its data centers, and Amazon is offering them on their cloud services. Previously, these FPGAs were mainly used in electronics engineering, but not so much in software engineering. Are FPGAs about to take off and become serious alternatives to CPUs and GPUs?
FPGAs are used extensively by e.g. the Amiga community to recreate older chipsets.