GNU Compiler Collection 6.3 released
GCC 6.3 is a bug-fix release from the GCC 6 branch containing important fixes for regressions and serious bugs in GCC 6.2, with more than 79 bugs fixed since the previous release.
A while back I decided to try to write a Game Boy emulator in Common Lisp based on this series of articles. I made some good progress but eventually got bogged down because I was trying to learn a bunch of complex new things at once.
Instead of dragging on, I decided to take a break and try something simpler: a CHIP-8 emulator/interpreter. The CHIP-8 is much simpler than the Game Boy, which made it easier to experiment with the rest of the infrastructure.
In this post and a couple of future ones I'll walk through all of my CHIP-8 emulator implementation.
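The emulator itself is written in Common Lisp, so don't take the snippet below as the author's code; it's just a minimal sketch, in C, of the fetch/decode loop at the heart of any CHIP-8 interpreter, to show why the machine is such a gentle on-ramp: two-byte big-endian opcodes, dispatched on the high nibble, with only a handful of registers to track.

#include <stdint.h>
#include <stdio.h>

typedef struct {
    uint8_t  memory[4096];
    uint8_t  v[16];     /* V0-VF registers */
    uint16_t i;         /* index register */
    uint16_t pc;        /* programs are loaded at 0x200 */
} chip8;

static void step(chip8 *c)
{
    /* Every CHIP-8 instruction is two bytes, stored big-endian. */
    uint16_t op  = (uint16_t)((c->memory[c->pc] << 8) | c->memory[c->pc + 1]);
    uint8_t  x   = (op >> 8) & 0xF;
    uint8_t  nn  = op & 0xFF;
    uint16_t nnn = op & 0xFFF;
    c->pc += 2;

    switch (op >> 12) {            /* dispatch on the high nibble */
    case 0x1: c->pc = nnn;   break;   /* 1NNN: jump to NNN */
    case 0x6: c->v[x] = nn;  break;   /* 6XNN: VX = NN */
    case 0x7: c->v[x] += nn; break;   /* 7XNN: VX = VX + NN (no carry flag) */
    case 0xA: c->i = nnn;    break;   /* ANNN: I = NNN */
    default:  printf("unhandled opcode %04X\n", op); break;
    }
}

int main(void)
{
    chip8 c = { .pc = 0x200 };
    uint8_t program[] = { 0x60, 0x2A,     /* V0 = 42 */
                          0x70, 0x01 };   /* V0 += 1 */
    for (unsigned k = 0; k < sizeof program; k++)
        c.memory[0x200 + k] = program[k];

    step(&c);
    step(&c);
    printf("V0 = %d\n", c.v[0]);          /* prints V0 = 43 */
    return 0;
}

A real interpreter fills in the remaining opcode families (drawing, input, timers), but the overall shape stays about this small.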
When you have built your retro computer the chances are you’ll turn it on and be faced with a BASIC interpreter prompt. This was the standard interface for home computers of the 8-bit era, one from which very few products deviated. If you were a teenager plugging your family's first ever computer into the living-room TV then your first port of call after getting bored with the cassette of free educational games that came with it would have been to open the manual and immerse yourself in programming.
The trouble is, in the several decades since, 8-bit BASIC skills have waned a little. Most people under 40 will have rarely if ever encountered it, and the generation who were there on the living room carpet with their Commodore 64s (or whatever) would probably not care to admit that this is the sum total of their remembered BASIC knowledge.
10 PRINT "Hello World"
20 GOTO 10
If you have built a retro-computer then clearly this is a listing whose appeal will quickly wane, so where can you brush up your 8-bit BASIC skills several decades after the demise of 8-bit home computers?
When I was very, very young - I'm from 1984 - I did some very basic BASIC, mostly on an MSX, but I remember very little of it. BASIC programming didn't grab me as a kid, and as such, I never went down the programmer's path. Today, with an adult life with adult responsibilities, learning to program seems like such a daunting undertaking, for which I simply don't have the time.
The 1.13 release includes several extensions to the language, including the long-awaited ? operator, improvements to compile times, and minor feature additions to cargo and the standard library. This release also includes many small enhancements to documentation and error reporting, by many contributors, that are not individually mentioned in the release notes.
The goal was to publish source code to a GPU that is register compatible with the late 90's era Number Nine "Ticket To Ride IV" GPU. Although the project didn't meet its funding goal, the person behind it later published the code on github.
Despite the fact that this is an older design, it has lots of stuff that is worth studying. It's interesting to compare this design to the VideoCore GPU that I walked through in a previous post. While there are some fundamental differences, there are a surprising number of functions that are similar, which shows how modern GPUs evolved from earlier ones.
A walkthrough of the GPLGPU as well as some history and backstory of the Number Nine "Ticket To Ride IV" GPU.
The goal of this project is to preserve and present primary and secondary source materials (including specifications, source code, manuals, and papers discussing design and implementation) from Mesa, the system programming language designed at Xerox PARC in the 1970s and used to implement the Xerox Star office automation system and its follow-ons. The editor greatly appreciates comments, suggestions, and donations of additional materials.
Wikipedia has a short overview of Mesa, and here's the 1979 Mesa Language Manual, which is obviously a lot more in-depth.
By now, simply taking over a game and replacing it with a brand new app was beginning to feel a little predictable. So this year, TASBot decided to show off a new skill. At the AGDQ marathon, the bot set out to edit new features onto a game that's still running in active memory. TASBot wanted to be magnanimous with its new capabilities, too, allowing human players (and livestream viewers) the opportunity to edit the game on the fly.
But just how did TASBot - and the team of coders behind it - intend to turn an old game of Super Mario World, running on a standard SNES, into a heavily editable game of Super Mario Maker? Luckily, we had a behind-the-scenes invite to the event and the opportunity to find out.
I spent most of last week watching AGDQ (and donating, of course), and this particular segment blew my mind.
So, Castlevania: Symphony of the Night is one of my favourite games of all time, and it's also generally considered to be one of the best games ever made. And, as with all games, people 'speedrun' this game, which means trying to beat the game as fast as is humanly possible. There are several categories, each with certain rules and things that are and are not allowed.
This particular speedrun of Symphony of the Night by Cosmo takes this concept to a whole new level. The end time of 7 minutes and 52 seconds is mind-blowing enough, but how he actually gets there is just utterly insane. Basically, he procures a very specific set of items, and then manipulates them in his inventory in an extremely precise way, within fractions of a second of game time, using the inventory's sorting mechanism to manipulate the game's code in memory and make the game finish itself. All this, on the actual console itself, without tools, without additional software, without emulators, without anything.
The actual science or coding behind this technique was discovered and developed by a person named Sockfolder, and he put up a 40-minute stream to explain in detail what's going on, with the contents of memory on the side of the screen so you can see exactly what's happening. It's mesmerising (even though I don't fully understand what's going on).
While the actual coding part of it can be discovered and explored in relative comfort of an emulator and other tools, actually pulling this off 'live', with just the tools at the disposal of any regular player, is absolutely amazing. This kind of stuff sits at the very fringes of programming, and I find it incredibly impressive.
If you are an iOS developer, the Windows ecosystem can appear a strange and frightening place. Writing an app for Windows requires an investment in all kinds of new things: new tools (Visual Studio), new languages (C#), new APIs and Controls (Win32, XAML), new graphics engines (DirectX), and before you know it, life seems too short and wouldn't another Flappy Bird clone be more fun anyway?
Fear not, brave adventurer, for Project Islandwood is here.
Today, the US Department of Energy announced a partnership with NVIDIA to enhance the LLVM compiler infrastructure. The goal is to port an existing Fortran compiler that targets massively parallel GPUs. The results are expected to be released as open source in late 2016.
Cutting-edge research still universally involves Fortran; a trio of challengers wants in. While Fortran isn't a mainstream language, it's still heavily used in scientific computing, and there's lots of legacy code that relies on it. A lot of that code is maintained by people at the US National Labs, and the new project is being organized by staff at Lawrence Livermore, Sandia, and Los Alamos.
League of Legends players collectively send millions of messages every day. They're asking friends to duo-queue, suggesting a team comp on the champ select screen, and thanking opponents for a good game. On July 21st of this year (I picked a day at random), players forged 1.7 million new friendships in the game - that's a lot of love! And each time players send a message they trigger a number of operations on the back-end technology that powers Riot chat.
In the previous episode of this series on chat, I discussed the protocol we chose to communicate between client and server: XMPP (Extensible Messaging and Presence Protocol). Today I'll dive into the mechanisms in place on the server side and the architecture of the infrastructure, and I'll discuss the work we've done to ensure that our servers are scalable and robust. As with the last article, I hope it'll be interesting to anyone building out chat features for a distributed client base.
The best class I took in college was on the philosopher Ludwig Wittgenstein. Until that point, I had avoided philosophy of language as simply being too esoteric and hermetic to be of use. David Pears, a prodigious yet modest and approachable figure visiting from Oxford, changed my mind. In large part because of Pears' instruction, Wittgenstein's philosophy has been directly relevant to my thinking about computer science, artificial intelligence, and cognitive science. When other scholars were thinking that language and thought could be reduced to a universal, logical language, Wittgenstein turned the matter to practical questions and raised incredibly inconvenient questions that gained traction in artificial intelligence in the 1970s, 40 years after he was working on them.
Great article. I found this paragraph especially interesting:
Here's one example. The French equivalents for here and there are ici and là respectively. But if I point to a pen and say, "The pen is here," the French equivalent is not "Le stylo est ici," but "Le stylo est là." In French, là is always used to refer to a specific place or position, while in English here or there can both work. This rule is so obscure I never learned it in French classes, but obviously all native speakers learn it because no one ever uses it differently. It could just as easily be the other way round, but it's not. The situation is not arbitrary, but the way in which language carves up the interaction between mind and world varies in such a way that French speakers recognize certain practices as right or wrong in a different way than English speakers do. This may seem a trivial point, until you have to program a computer to translate "I pointed to Paris on the map and said, 'She is here.' " into French - at which point it becomes a nightmare. (If you are a translator, on the other hand, this is great news.)
Aside from the obvious fact that I can relate to the remark about translators, the author touches upon something that I benefit from every day. I always feel that being multilingual (just Dutch, English, German, some French, and a basic grasp of ancient Greek and Latin - relatively limited when compared to true multilinguals) makes it easier for me to express myself. Being able to use words, concepts, ideas, structures, and conventions from foreign languages and incorporate them into my Dutch - even if only in my inner monologue - allows me to describe objects, concepts, and situations in a more fine-grained, and therefore, more accurate manner (accurate to my perception, which does not mean "more correct" in more absolute terms).
I appreciate how ridiculously pretentious this sounds, but I do firmly believe this is true: being able to understand, read, write, and speak multiple languages makes me better at language.
I'm no programmer - something I like to repeat as often as I can to make sure everyone knows where I'm coming from on the subject of programming - but I get the idea that programming is not very different in that regard. That is, being able to program in multiple programming languages will make you better at programming, and not just in the sense that you will be useful in more situations (you can find a job both as a Java and an Objective-C programmer, for instance), but also in the sense that knowledge and experience in programming language Abc will give you new and different insights into programming language Xyz, allowing you to use a given language in unconventional ways that people who know fewer languages might not think of.
As much as language is an expression of culture, a programming language is an expression of how a computer works. Both contain within them invaluable knowledge that cannot be easily expressed in other languages - and as such, they are invaluable in preserving knowledge, both culturally and digitally.
I spent a lot of time as a kid playing (generally, pretty terrible) games on my Game Boy. Having never written code for anything other than 'regular' general purpose computers before, I've been wondering recently: how easy is it to write a Game Boy (Advance) game?
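As it turns out, getting pixels onto a GBA screen takes remarkably little code. Here's a minimal sketch (assuming a homebrew toolchain such as devkitARM supplies the startup code and linker script; the register addresses are the documented hardware ones) that switches the display into bitmap mode 3 and plots a single pixel straight into video memory.

#include <stdint.h>

#define REG_DISPCNT (*(volatile uint16_t *)0x04000000)
#define VRAM        ((volatile uint16_t *)0x06000000)
#define MODE3       0x0003   /* 240x160, 15-bit colour bitmap */
#define BG2_ENABLE  0x0400

int main(void)
{
    REG_DISPCNT = MODE3 | BG2_ENABLE;

    /* Plot a red pixel in the middle of the 240x160 screen. */
    VRAM[80 * 240 + 120] = 0x001F;   /* 15-bit BGR: pure red */

    for (;;) { }                      /* spin; a real game would wait for VBlank */
}

Everything beyond that (sprites, sound, input) is layered on top of the same kind of memory-mapped registers.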
This release includes significant changes to the implementation. The compiler tool chain was translated from C to Go, removing the last vestiges of C code from the Go code base. The garbage collector was completely redesigned, yielding a dramatic reduction in garbage collection pause times. Related improvements to the scheduler allowed us to change the default GOMAXPROCS value (the number of concurrently executing goroutines) from 1 to the number of logical CPUs. Changes to the linker enable distributing Go packages as shared libraries to link into Go programs, and building Go packages into archives or shared libraries that may be linked into or loaded by C programs (design doc).
This is a follow-up to a previously posted challenge to virtualize VenturComm Venix/86 so that it can be run on a modern machine under an emulator. The competition was a huge success and the rest of this post is an entry by the winner - Jim Carpenter. Enjoy!
The Facebook application for Android isn't exactly, shall we say, best-in-class for a multitude of reasons, but at least Facebook is trying to improve it. This is their latest effort.
In our exploration of alternate formats, we came across FlatBuffers, an open source project from Google. FlatBuffers is an evolution of protocol buffers that includes object metadata, allowing direct access to individual subcomponents of the data without having to deserialize the entire object (in this case, a tree) up front.
Might be useful for other Android developers as well.
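FlatBuffers' real wire format is more involved (vtables, relative offsets, generated accessor code), but the zero-copy idea the quote describes can be sketched in a few lines. The toy layout below is hypothetical and not FlatBuffers itself; it just shows what "read one field without deserializing the whole object" means in practice.

#include <stdint.h>
#include <string.h>
#include <stdio.h>

/* Hypothetical layout: [uint32 offset_of_hp][...other data...][int32 hp] */
static int32_t read_hp(const uint8_t *buf)
{
    uint32_t off;
    memcpy(&off, buf, sizeof off);        /* field offset stored up front */
    int32_t hp;
    memcpy(&hp, buf + off, sizeof hp);    /* jump straight to the field */
    return hp;                            /* nothing else gets parsed */
}

int main(void)
{
    /* Build a little buffer by hand for the demo. */
    uint8_t buf[16] = {0};
    uint32_t off = 8;
    int32_t hp = 300;
    memcpy(buf, &off, sizeof off);
    memcpy(buf + off, &hp, sizeof hp);
    printf("hp = %d\n", read_hp(buf));    /* prints hp = 300 */
    return 0;
}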
But the people calling for a bytecode for the browser never went away, and they were never entirely wrong about the perceived advantages. And now they're going to get their wish. WebAssembly is a new project being worked on by people from Mozilla, Microsoft, Google, and Apple, to produce a bytecode for the Web.
WebAssembly, or wasm for short, is intended to be a portable bytecode that will be efficient for browsers to download and load, providing a more efficient target for compilers than plain JavaScript or even asm.js. Like, for example, .NET bytecode, wasm instructions operate on native machine types such as 32-bit integers, enabling efficient compilation. It's also designed to be extensible, to make it easy to add, say, support for SIMD instruction sets like SSE and AVX.
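To make the "native machine types" point concrete, here's a hedged example: a small C function like the one below, compiled to wasm (with a toolchain such as Emscripten), comes out as plain i32 loads, adds, and comparisons, rather than the boxed floating-point numbers a JavaScript engine would otherwise have to reason about.

/* A trivial function of the sort you'd compile to wasm and export: the int
   parameters and locals map directly onto wasm's native i32 type, so the
   loop body becomes i32.load / i32.add / i32.lt_s instructions. */
int sum(const int *values, int count)
{
    int total = 0;
    for (int i = 0; i < count; i++)
        total += values[i];
    return total;
}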
PicoC is a very small C interpreter for scripting. It was originally written as a script language for a UAV's on-board flight system. It's also very suitable for other robotic, embedded and non-embedded applications.
The core C source code is around 3500 lines of code. It's not intended to be a complete implementation of ISO C but it has all the essentials. When compiled it only takes a few k of code space and is also very sparing of data space. This means it can work well in small embedded devices. It's also a fun example of how to create a very small language implementation while still keeping the code readable.
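I haven't checked exactly which library calls a given PicoC build exposes, but to give a feel for the "essentials" it aims at, a script for an interpreter like this is just ordinary, small-scale C:

#include <stdio.h>

/* The sort of plain, C90-flavoured code a tiny interpreter like PicoC is
   built to run: basic types, control flow, functions, printf. */
int blink(int times)
{
    int i;
    for (i = 0; i < times; i++)
        printf("LED %d: on, then off\n", i);
    return times;
}

int main()
{
    blink(3);
    return 0;
}

Which is much of the appeal: the scripting language for your embedded project is the same language the firmware around it is written in.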
Today we are very proud to announce the 1.0 release of Rust, a new programming language aiming to make it easier to build reliable, efficient systems. Rust combines low-level control over performance with high-level convenience and safety guarantees. Better yet, it achieves these goals without requiring a garbage collector or runtime, making it possible to use Rust libraries as a "drop-in replacement" for C. If you'd like to experiment with Rust, the "Getting Started" section of the Rust book is your best bet (if you prefer to use an e-reader, Pascal Hertleif maintains unofficial e-book versions as well).
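The "drop-in replacement for C" claim comes down to Rust speaking the plain C ABI without a runtime or collector that needs initialising. As a hedged sketch (the add function and the library it lives in are made up for illustration), the C side of such a swap could look like this, with the Rust side reduced to a couple of attributes on an ordinary function.

#include <stdint.h>
#include <stdio.h>

/* Hypothetical: 'add' is implemented in a Rust library built as a staticlib,
   roughly:
       #[no_mangle]
       pub extern "C" fn add(a: u32, b: u32) -> u32 { a + b }
   Because there is no Rust runtime or GC to set up, the C caller just
   declares and links it like any other C function. */
extern uint32_t add(uint32_t a, uint32_t b);

int main(void)
{
    printf("2 + 3 = %u\n", (unsigned)add(2, 3));   /* link against the Rust-built library */
    return 0;
}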
I work for a certain corporation which uses a certain product. This is its story. To put the quality of this product into perspective, let me say it's been in development for about 20 years and has pretty much no users (besides my corp and some "hey - let's make our own crappy Linux distro, which no one will ever use" fanatics) and no community. It was written by a C programmer who "doesn't like the notion of 'type' in programming". Let that be a prelude to what's to follow. Envy those who don't know it; pity those who use it.
The product is called Enlightenment Foundation Libraries and it's the absolute worst piece of shit software you can imagine.
Poor Tizen.