General Development Archive

x86-64 assembly language programming with Ubuntu

The purpose of this text is to provide a reference for university-level assembly language and systems programming courses. Specifically, this text addresses the x86-64 instruction set for the popular x86-64 class of processors using the Ubuntu 64-bit Operating System (OS). While the provided code and various examples should work under any Linux-based 64-bit OS, they have only been tested under Ubuntu 14.04 LTS (64-bit).

Your light reading for the weekend.

Dart 2.0 released

Coming from Dart 1, there are two major developer-facing changes, the larger of which is a stronger type system, including runtime checks to help catch errors that would arise from mismatched or incorrectly labeled types. This type system, originally called "strong mode", has long been the default in Flutter. The other is an interesting quality-of-life change for Flutter developers, which allows creating an instance of a class without the "new" keyword. The goal of this change is to make Flutter code more readable, less clunky, and easier to type, but the principle applies to all Dart code.

The complete list of changes has all the details.

Hello world on z/OS

If you've followed any one of the amazing tutorials on how to set up a mainframe on a conventional personal computer, you've probably noticed they end with the login screen as if everything beyond that point will be intuitive and self-explanatory to newbies. I mean... That was my assumption going into this project. I'll figure it out. How hard could it be? Maybe it would take me a few hours. Maybe I'd have to Google some stuff... Read some documentation...

It took me over a week.

Over a week to figure out enough to compile and run a basic program.

Why Discord is sticking with React Native

Looking back at the past three years, React Native has proven to be extremely successful at Discord and helped drive our iOS user adoption from zero to millions!

More specifically, React Native has allowed us to reap the benefits of quickly leveraging reusable code across platforms, as well as develop a small and mighty team.

Meanwhile, we've learned to adapt to its inevitable pain points without sacrificing overall productivity.

We all complain a lot about these non-native, cross-platform frameworks, but it's only fair to also highlight the other side of the coin - in this case, the view from the developers of an incredibly popular application who need to easily support multiple platforms.

Small computer system supports large-scale multi-user APL

Another article from a very much bygone era - we're talking 1977, and for sure this one's a bit over my head. I like being honest.

APL (A Programming Language) is an interactive language that allows access to the full power of a large computer while maintaining a user interface as friendly as a desktop calculator. APL is based on a notation developed by Dr. Kenneth Iverson of IBM Corporation over a decade ago, and has been growing in popularity in both the business and scientific community. The popularity of APL stems from its powerful primitive operations and data structures, coupled with its ease of programming and debugging.

Most versions of APL to date have been on large and therefore expensive computers. Because of the expense involved in owning a computer large enough to run APL, most of the use of APL outside of IBM has been through commercial timesharing companies. The introduction of APL 3000 marks the first time a large-machine APL has been available on a small computer. APL 3000 is a combination of software for the HP 3000 Series II Computer System and a CRT terminal, the HP 2641A, that displays the special symbols used in APL.
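To give a flavor of those "powerful primitive operations": the classic APL idiom for an average, (+/X)÷⍴X, sum-reduces a vector with + and divides by its shape, all in a handful of symbols. A rough Python sketch of what those primitives are doing:

```python
# The APL expression (+/X) ÷ ⍴X reads: reduce X with +, then divide
# by the shape (length) of X. A plain Python equivalent:

def average(x):
    # +/ is the sum-reduction, ⍴ is the shape (here, the length)
    return sum(x) / len(x)

print(average([1, 2, 3, 4]))  # → 2.5
```

The point isn't the arithmetic, of course, but that APL expresses the whole computation as a single composition of array primitives.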

Enjoy.

The land before binary

The IRS has a lot of mainframes. And as millions of Americans recently found out, many of them are quite old. So as I wandered about meeting different types of engineers and chatting about their day-to-day blockers I started to learn much more about how these machines worked. It was a fascinating rabbit hole that exposed me to things like "decimal machines" and "2 out of 5 code". It revealed something to me that I had not ever considered:

Computers did not always use binary code.

Computers did not always use base 2. Computers did not always operate on just an on/off value. There were other things, different things that were tried and eventually abandoned. Some of them predating the advent of electronics itself.

Here's a little taste of some of those systems.
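As a concrete taste of my own: in a two-out-of-five code, each decimal digit is stored as five bits of which exactly two are set, and the digit is the sum of the weights of the set bits. The sketch below uses the POSTNET-style weights 7-4-2-1-0 (other machines used other weight sets), with the 7+4 combination standing in for zero:

```python
# Minimal sketch of two-out-of-five decimal encoding, one of the
# non-binary digit representations mentioned above. Weights are the
# POSTNET variant; real decimal machines used several different sets.

WEIGHTS = [7, 4, 2, 1, 0]

def encode_digit(d):
    """Return the five-bit two-out-of-five code word for a decimal digit."""
    for i in range(5):
        for j in range(i + 1, 5):
            total = WEIGHTS[i] + WEIGHTS[j]
            # The 7+4 pair (sum 11) conventionally encodes zero.
            if total == d or (d == 0 and total == 11):
                bits = [0] * 5
                bits[i] = bits[j] = 1
                return bits
    raise ValueError("not a decimal digit")

def decode(bits):
    """Recover the digit; any word without exactly two set bits is invalid."""
    if sum(bits) != 2:
        raise ValueError("invalid code word")
    total = sum(w for w, b in zip(WEIGHTS, bits) if b)
    return 0 if total == 11 else total

print(encode_digit(5))          # → [0, 1, 0, 1, 0]  (4 + 1)
print(decode([1, 1, 0, 0, 0]))  # → 0  (7 + 4 stands for zero)
```

The exactly-two-bits-set rule is the appeal: any single flipped bit yields a word with one or three set bits, so errors are cheap to detect in hardware.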

I've often wondered why computers are binary to begin with, but it was one of those stray questions that sometimes pops up in your head while you're waiting for your coffee or while driving, only to then rapidly disappear.

I have an answer now, but I don't really understand any of this.

UTC is enough for everyone… Right?

Programming time, dates, timezones, recurring events, leap seconds... Everything is pretty terrible.

The common refrain in the industry is Just use UTC! Just use UTC! And that's correct... Sort of. But if you're stuck building software that deals with time, there's so much more to consider.
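One of the classic gotchas, sketched in Python with the standard-library zoneinfo module (the dates are just an illustrative example, not from the article): a recurring "9 AM local" event can't simply be stored as a UTC instant, because the local UTC offset changes across a DST transition.

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+

ny = ZoneInfo("America/New_York")

# A 9 AM meeting in New York on March 8, 2024 (before the DST switch).
meeting = datetime(2024, 3, 8, 9, 0, tzinfo=ny)
utc_stored = meeting.astimezone(ZoneInfo("UTC"))  # 14:00 UTC (EST is UTC-5)

# Naively adding 7 days to the stored UTC instant crosses the
# spring-forward on March 10, when New York jumps to UTC-4...
next_week_utc = utc_stored + timedelta(days=7)

# ...so the "same" meeting now lands at 10 AM local, not 9 AM.
print(next_week_utc.astimezone(ny))
```

"Just use UTC" works for timestamps of things that already happened; for future and recurring events you generally need to keep the local wall-clock time and the time zone, and resolve to UTC late.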

It's time... To talk about time.

This is one of the best articles - experiences? - I've ever read. It's funny, well-written, deeply informative, and covers everything from programming with time, to time and UI design, to time and accessibility. This is simply an amazing piece of work.

C is not a low-level language

In the wake of the recent Meltdown and Spectre vulnerabilities, it's worth spending some time looking at root causes. Both of these vulnerabilities involved processors speculatively executing instructions past some kind of access check and allowing the attacker to observe the results via a side channel. The features that led to these vulnerabilities, along with several others, were added to let C programmers continue to believe they were programming in a low-level language, when this hasn't been the case for decades.

Processor vendors are not alone in this. Those of us working on C/C++ compilers have also participated.

The soon-to-be-extinct embedded software engineer

Embedded systems have started to become extremely complex. The big push to connect every device to the internet to create the IoT is causing a demand for embedded software engineers that has not yet been seen in recent history. This big push is causing a vacuum in which companies can't find enough embedded software engineers. Instead of training new engineers, they are starting to rely on application developers, who have experience with Windows applications or mobile devices, to develop their real-time embedded software. The problem, of course, is that these engineers don't understand the low-level hardware, but only high-level application frameworks that do all the work for them.

Is this actually true? It's very difficult to gauge, since most attention in development goes to "sexy" work such as smartphone applications or websites - there's very little media visibility for lower-level engineering such as embedded development, kernel work, and so on. Since I know how easy it is to fall into the trap of believing that everything was better in the past, I genuinely wonder whether this is really a problem, or whether we just perceive it as one.

The desktop belongs to Electron

This doesn’t have to be forever. Maybe in the future, developers will start using React Native to build desktop applications. Or perhaps Flutter! Electron apps have a bad reputation for using too much RAM, have potential security issues, can’t (yet) match the speed of C++, and they often lack the polish and familiarity of a great native app.

But it seems clear to me that OS-specific SDKs are becoming a liability for desktop OS vendors. Developers want to use the technologies they know, and they want maximum reach for the products they build. And they’re smart enough to get what they want. A lack of cooperation on the part of Apple, Google, and Microsoft will only hurt users.

Say hello to your new Electron overlord.

At 33, I'm perhaps starting to show signs of becoming an old man, but I really don't like Electron applications. I use Discord every day, and it just feels slow, cumbersome, and out of place on my virtually 100% Modern/Fluent Design Windows desktop, Surface, and my iPhone X. I greatly prefer proper, platform-specific native applications, but I feel that ship may have sailed with things like Electron and Progressive Web Apps.

I'm not looking forward to this future.

The story of ispc

I've decided to write up a little history of ispc, the compiler I wrote when I was at Intel. There's a lot to say, so it'll come out in a series of posts over the next few weeks. While I've tried to get all the details right and properly credit people, this is all from my memory. For anyone who was around at the time, please send an email if you see any factual errors.

The above links to the first part in the series - there's a table of contents for the entire series.

A constructive look at the Atari 2600 BASIC cartridge

Honestly, I don't think the Atari 2600 BASIC has ever had a fair review. It's pretty much reviled as a horrible program, a horrible programming environment and practically useless. But I think that's selling it short. Yes, it's bad (and I'll get to that in a bit), but in using it for the past few days, there are some impressive features on a system where the RAM can't hold a full Tweet and half the CPU time is spent Racing The Beam. I'll get the bad out of the way first.

Input comes from the Atari Keypads, dual 12-button keypads. If that weren't bad enough, I'm using my keyboard as an emulated pair of Atari Keypads, where I have to keep this image open at all times.

Okay, here's how this works.

An older story - it's from 2015 - but fascinating nonetheless.

ARM GCC cross compilation in Visual Studio

In Visual Studio 2017 15.5 Preview 2 we are introducing support for cross compilation targeting ARM microcontrollers. To enable this in the installation choose the Linux development with C++ workload and select the option for Embedded and IoT Development. This adds the ARM GCC cross compilation tools and Make to your installation.

Our cross compilation support uses our Open Folder capabilities so there is no project system involved. We are using the same JSON configuration files from other Open Folder scenarios and have added additional options to support the toolchains introduced here. We hope that this provides flexibility for many styles of embedded development. The best way to get started and understand the capabilities is with a project exported from the ARM mbed online compiler. We'll cover the basics here; to learn more about the online compiler, see ARM's tutorials, and you can sign up for an account here.

Arcan 0.5.3, Durden 0.3 released

Once again there's been a release of the "game engine meets display server meets multimedia framework" project, Arcan, and of its reference desktop environment Durden.

Among the many new engine features this time around, we find: improved crash recovery, much improved support for Wayland clients, and initial support for OpenBSD. Among the DE features, we find window slicing and overlays, input multicasting, and LED controller profiles.

Refer to the full release announcement for more details and videos.

Swift 4.0 released

Swift 4 is now officially released! Swift 4 builds on the strengths of Swift 3, delivering greater robustness and stability, providing source code compatibility with Swift 3, making improvements to the standard library, and adding features like archival and serialization.

You can get a quick overview by watching the WWDC 2017: What's New in Swift presentation, and try out some of the new features in this playground put together by Ole Begemann.