General Development Archive

What’s the difference between an integer and a pointer?

In an assembly language we typically don't have to worry very much about the distinction between pointers and integers. Some instructions happen to generate addresses whereas others behave arithmetically, but underneath there's a single data type: bitvectors. At the opposite end of the PL spectrum, a high-level language won't offer opportunities for pointer/integer confusion because the abstractions are completely firewalled off from each other. Also, of course, a high-level language may choose not to expose anything that resembles a pointer.
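
For a concrete feel of the gap being described, here's a minimal C sketch (my own illustration, not code from the article). At the machine level, the pointer and the integer below are just bit patterns of the same width on a typical 64-bit target; C only lets you move between them through an explicit cast.

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        int x = 42;
        int *p = &x;

        /* On a typical 64-bit target, both of these occupy 64 bits. */
        uintptr_t bits = (uintptr_t)p;  /* pointer -> integer */
        int *q = (int *)bits;           /* integer -> pointer */

        printf("%d\n", *q);  /* prints 42; the round trip is permitted */
        return 0;
    }

Whether a compiler must treat q as a valid way to reach x after that round trip is exactly the sort of question that makes the pointer/integer distinction interesting above the assembly level.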

LLVM 7.0.0 released

The release contains the work on trunk up to SVN revision 338536, plus work on the release branch. It is the result of the community's work over the past six months, including: function multiversioning in Clang with the 'target' attribute for ELF-based x86/x86_64 targets; improved PCH support in clang-cl; preliminary DWARF v5 support; basic support for OpenMP 4.5 offloading to NVPTX; OpenCL C++ support; MSan, X-Ray, and libFuzzer support for FreeBSD; early UBSan, X-Ray, and libFuzzer support for OpenBSD; UBSan checks for implicit conversions; many long-tail compatibility issues fixed in lld, which is now production ready for ELF, COFF, and MinGW; and the new tools llvm-exegesis, llvm-mca, and diagtool. And as usual, many optimizations, improved diagnostics, and bug fixes.
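
To give one of those items a bit of shape: function multiversioning with the 'target' attribute lets you supply several definitions of the same function and have the most suitable one selected at load time. A rough sketch of what that looks like in C (my own example; see the Clang documentation for the exact language modes and targets supported):

    /* Two versions of the same function; on ELF x86/x86_64 targets the
       compiler emits an ifunc resolver that picks one at load time based
       on the CPU's capabilities. */
    __attribute__((target("default")))
    int dot(const int *a, const int *b, int n) {
        int sum = 0;
        for (int i = 0; i < n; i++)
            sum += a[i] * b[i];
        return sum;
    }

    __attribute__((target("avx2")))
    int dot(const int *a, const int *b, int n) {
        /* Same source; this version may be vectorized with AVX2. */
        int sum = 0;
        for (int i = 0; i < n; i++)
            sum += a[i] * b[i];
        return sum;
    }

    /* Callers simply call dot(); the dispatch is invisible to them. */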

The release notes have all the details.

On the road to pure Go X11 GUIs

And so I've placed a bet on Go. It is just as conceptually simple as C, sports a friendly BSD-style license, and already has its own parallel ecosystem. No stinky LLVM; in fact, no traces of C at all! It's an overlooked revolution! I can follow symbols through packages however deep I want to, and I always end up in Go or its assembly. Well, so long as nothing ugly uses Cgo.

Right, now that I've embraced the garbage collector, how do I make an interface that doesn't look like it dates back to the '80s? And can I avoid Cgo?

Learning BASIC like it’s 1983

Now, of course, I tell computers what to do for a living. All the same, I can't help feeling that I missed out on some fundamental insight afforded only to those who grew up programming simpler computers. What would it have been like to encounter computers for the first time in the early 1980s? How would that have been different from the experience of using a computer today?

This post is going to be a little different from the usual Two-Bit History post because I'm going to try to imagine an answer to these questions.

This is a great idea.

x86-64 assembly language programming with Ubuntu

The purpose of this text is to provide a reference for University level assembly language and systems programming courses. Specifically, this text addresses the x86-64 instruction set for the popular x86-64 class of processors using the Ubuntu 64-bit Operating System (OS). While the provided code and various examples should work under any Linux-based 64-bit OS, they have only been tested under Ubuntu 14.04 LTS (64-bit).

Your light reading for the weekend.

Dart 2.0 released

Coming from Dart 1, there are two major developer-facing changes, the larger of which is a stronger type system, including runtime checks to help catch errors that would arise from mismatched or incorrectly labeled types. This type system, originally called "strong mode", has long been the default in Flutter. The other is an interesting quality-of-life change for Flutter developers, which allows creating an instance of a class without the "new" keyword. The goal of this change is to make Flutter code more readable, less clunky, and easier to type, but the principle applies to all Dart code.

The complete list of changes has all the details.

Hello world on z/OS

If you've followed any one of the amazing tutorials on how to set up a mainframe on a conventional personal computer, you've probably noticed they end with the login screen as if everything beyond that point will be intuitive and self-explanatory to newbies. I mean... That was my assumption going into this project. I'll figure it out. How hard could it be? Maybe it would take me a few hours. Maybe I'd have to Google some stuff... Read some documentation...

It took me over a week.

Over a week to figure out enough to compile and run a basic program.

Why Discord is sticking with React Native

Looking back at the past three years, React Native has proven to be extremely successful at Discord and helped drive our iOS user adoption from zero to millions!

More specifically, React Native has allowed us to reap the benefits of quickly leveraging reusable code across platforms, as well as develop a small and mighty team.

Meanwhile, we've learned to adapt to its inevitable pain points without sacrificing overall productivity.

We all complain a lot about these non-native, cross-platform frameworks, but it's only fair to also highlight the other side of the coin - in this case, the view from the developers of an incredibly popular application who need to easily support multiple platforms.

Small computer system supports large-scale multi-user APL

Another article from a very much bygone era - we're talking 1977 - and for sure this one's a bit over my head. I like being honest.

APL (A Programming Language) is an interactive language that allows access to the full power of a large computer while maintaining a user interface as friendly as a desktop calculator. APL is based on a notation developed by Dr. Kenneth Iverson of IBM Corporation over a decade ago, and has been growing in popularity in both the business and scientific community. The popularity of APL stems from its powerful primitive operations and data structures, coupled with its ease of programming and debugging.

Most versions of APL to date have been on large and therefore expensive computers. Because of the expense involved in owning a computer large enough to run APL, most of the use of APL outside of IBM has been through commercial timesharing companies. The introduction of APL 3000 marks the first time a large-machine APL has been available on a small computer. APL 3000 is a combination of software for the HP 3000 Series II Computer System and a CRT terminal, the HP 2641A, that displays the special symbols used in APL.

Enjoy.

The land before binary

The IRS has a lot of mainframes. And as millions of Americans recently found out, many of them are quite old. So as I wandered about meeting different types of engineers and chatting about their day-to-day blockers, I started to learn much more about how these machines worked. It was a fascinating rabbit hole that exposed me to things like "decimal machines" and "2 out of 5 code". It revealed something to me that I had never considered:

Computers did not always use binary code.

Computers did not always use base 2. Computers did not always operate on just an on/off value. There were other things, different things that were tried and eventually abandoned, some of them predating the advent of electronics itself.

Here's a little taste of some of those systems.

I've often wondered why computers are binary to begin with, but it was one of those stray questions that pop up in your head while you're waiting for your coffee or while driving, only to rapidly disappear again.

I have an answer now, but I don't really understand any of this.
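
For anyone who wants at least one concrete handle on this: in a two-out-of-five code, each decimal digit is stored in five bits of which exactly two are set, so a single flipped bit is always detectable. Here's a small sketch (my own, using the common 7-4-2-1-0 weighting; the historical machines used several variants):

    #include <stdio.h>

    /* Two-out-of-five code with weights 7,4,2,1,0 (bit 4 down to bit 0):
       each digit is the sum of exactly two weights, with 0 conventionally
       taking the leftover 7+4 pattern. */
    static const unsigned char two_of_five[10] = {
        0x18, /* 0 -> 7+4  11000 */
        0x03, /* 1 -> 1+0  00011 */
        0x05, /* 2 -> 2+0  00101 */
        0x06, /* 3 -> 2+1  00110 */
        0x09, /* 4 -> 4+0  01001 */
        0x0A, /* 5 -> 4+1  01010 */
        0x0C, /* 6 -> 4+2  01100 */
        0x11, /* 7 -> 7+0  10001 */
        0x12, /* 8 -> 7+1  10010 */
        0x14, /* 9 -> 7+2  10100 */
    };

    /* A code word is valid only if exactly two of its five bits are set. */
    static int is_valid(unsigned char w) {
        int bits = 0;
        for (int i = 0; i < 5; i++)
            bits += (w >> i) & 1;
        return bits == 2;
    }

    int main(void) {
        for (int d = 0; d < 10; d++)
            printf("%d -> %02x (valid: %d)\n",
                   d, two_of_five[d], is_valid(two_of_five[d]));
        return 0;
    }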

UTC is enough for everyone… Right?

Programming time, dates, timezones, recurring events, leap seconds... Everything is pretty terrible.

The common refrain in the industry is Just use UTC! Just use UTC! And that's correct... Sort of. But if you're stuck building software that deals with time, there's so much more to consider.

It's time... To talk about time.

This is one of the best articles - experiences? - I've ever read. It's funny, well-written, deeply informative, and covers everything from programming with time, to time and UI design, to time and accessibility. This is simply an amazing piece of work.

C is not a low-level language

In the wake of the recent Meltdown and Spectre vulnerabilities, it's worth spending some time looking at root causes. Both of these vulnerabilities involved processors speculatively executing instructions past some kind of access check and allowing the attacker to observe the results via a side channel. The features that led to these vulnerabilities, along with several others, were added to let C programmers continue to believe they were programming in a low-level language, when this hasn't been the case for decades.

Processor vendors are not alone in this. Those of us working on C/C++ compilers have also participated.
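
The "speculating past an access check" part is easiest to see in the widely published Spectre variant 1 pattern. The snippet below is my sketch of that gadget, with illustrative names, not code from the article:

    #include <stddef.h>
    #include <stdint.h>

    uint8_t array1[16];
    uint8_t array2[256 * 4096];
    size_t  array1_size = 16;

    void victim(size_t x) {
        if (x < array1_size) {
            /* The CPU may speculate this branch as taken even for an
               out-of-bounds x. The dependent load then touches a cache
               line of array2 chosen by the secret byte array1[x]; timing
               which line is warm afterwards leaks that byte, even though
               the architectural result is discarded. */
            uint8_t tmp = array2[array1[x] * 4096];
            (void)tmp;
        }
    }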

The soon-to-be-extinct embedded software engineer

Embedded systems have started to become extremely complex. The big push to connect every device to the internet and create the IoT is driving a demand for embedded software engineers the likes of which has not been seen in recent history. This push is creating a vacuum in which companies can't find enough embedded software engineers. Instead of training new engineers, they are starting to rely on application developers, who have experience with Windows applications or mobile devices, to develop their real-time embedded software. The problem, of course, is that these engineers don't understand the low-level hardware, only the high-level application frameworks that do all the work for them.

Is this actually true? It's very difficult to gauge, since most of the attention in development goes to "sexy" work such as smartphone applications or websites - there's very little media visibility for lower-level fields such as embedded development, kernel engineering, and so on. Since I know how easy it is to fall into the trap of believing that everything was better in the past, I genuinely wonder whether this is really a problem, or whether we just perceive it as one.

The desktop belongs to Electron

This doesn’t have to be forever. Maybe in the future, developers will start using React Native to build desktop applications. Or perhaps Flutter! Electron apps have a bad reputation: they use too much RAM, have potential security issues, can’t (yet) match the speed of C++, and often lack the polish and familiarity of a great native app.

But it seems clear to me that OS-specific SDKs are becoming a liability for desktop OS vendors. Developers want to use the technologies they know, and they want maximum reach for the products they build. And they’re smart enough to get what they want. A lack of cooperation on the part of Apple, Google, and Microsoft will only hurt users.

Say hello to your new Electron overlord.

At 33, I'm perhaps starting to show signs of becoming an old man, but I really don't like Electron applications. I use Discord every day, and it just feels slow, cumbersome, and out of place on my virtually 100% Modern/Fluent Design Windows desktop, my Surface, and my iPhone X. I greatly prefer proper, platform-specific native applications, but I feel that ship may have sailed with things like Electron and Progressive Web Apps.

I'm not looking forward to this future.

The story of ispc

I've decided to write up a little history of ispc, the compiler I wrote when I was at Intel. There's a lot to say, so it'll come out in a series of posts over the next few weeks. While I've tried to get all the details right and properly credit people, this is all from my memory. If you were around at the time, please send an email if you see any factual errors.

The above links to the first part - there's also a table of contents for the entire series.

A constructive look at the Atari 2600 BASIC cartridge

Honestly, I don't think the Atari 2600 BASIC has ever had a fair review. It's pretty much reviled as a horrible program, a horrible programming environment, and practically useless. But I think that's selling it short. Yes, it's bad (and I'll get to that in a bit), but having used it for the past few days, I've found some impressive features on a system where the RAM can't hold a full Tweet and half the CPU time is spent Racing The Beam. I'll get the bad out of the way first.

Input comes from the Atari Keypads, dual 12-button keypads. If that weren't bad enough, I'm using my keyboard as an emulated pair of Atari Keypads, where I have to keep this image open at all times.

Okay, here's how this works.

An older story - it's from 2015 - but fascinating nonetheless.