Classic Amiga assembler tutorial

If you want to write assembly programs for the Amiga, you can either work directly on a real system or use a cross-compiler. I prefer to work on my Linux system because, as much as I like retro architectures, I also like the power of a good Unix system and a modern editor.

Cross-compiling is a very simple concept: instead of compiling source code and creating binaries for the architecture you are running the compiler on, you create binaries for a different architecture. In this case, the host architecture is Linux/amd64 and the target architecture is the Amiga.

As this is not the only project I am working on at the moment, I created a directory to host everything I need for Amiga development: compiler, documentation, and scripts.

Exploring assembly on the Amiga, part 1, part 2, and part 3.

The US net neutrality repeal is official.

It’s official. The Federal Communications Commission's repeal of net neutrality rules, which had required internet service providers to offer equal access to all web content, took effect on Monday.

The rules, enacted by the administration of President Barack Obama in 2015, prohibited internet providers from charging more for certain content or from giving preferential treatment to certain websites.

Great news. This will enable honest, trustworthy, transparent, and customer-focused companies like Comcast to take control of the internet. This can only mean good things for American consumers, and will ensure that they remain free of the confusing and heavy burden of ISP choice. In turn, the "market" will remain carved up by at best two large monopolies, which is clearly the best type of market in the universe.

It’s 2018 and USB Type-C is still a mess

USB Type-C was billed as the solution for all our future cable needs, unifying power and data delivery with display and audio connectivity, and ushering in an age of the one-size-fits-all cable. Unfortunately for those already invested in the USB Type-C ecosystem, which is anyone who has bought a flagship phone in the past couple of years, the standard has probably failed to live up to its promises.

Other than my Nintendo Switch, my back-up phone (a Galaxy S8), and my old Nexus 6P in storage somewhere, I don't use USB-C at all, so I've been able to avoid all of its problems so far. It seems like a real mess.

The 640K memory limit of MS-DOS

At the beginning of the '90s, the PC platform was often mocked by its rivals. Of course, PCs were much more powerful than, say, an Amiga 500. But the Amiga offered a flat address space, while a DOS program could only access memory using cumbersome 64 KiB segments. And to add insult to injury, there was this strange 640 KiB memory limitation. No matter how much physical memory you had in your box, the all-important conventional memory was limited to 640 KiB!

Legend has it that Bill Gates once declared that "640 KB ought to be enough for anybody", and then designed MS-DOS to enforce this limitation.

The truth is of course a little more complicated than that.

This article brings back so many confusing childhood memories of MS-DOS and memory management - memories I wouldn't wish on my biggest enemies. All kidding aside, this is a great insight into how memory is organised in MS-DOS.
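
For anyone who never had to live with it: the 640 KiB ceiling falls straight out of how real-mode addresses are built from a 16-bit segment and a 16-bit offset. Here's a minimal Python sketch of that arithmetic - the helper names are mine, not anything from the article - showing where the boundary sits.

```python
# A minimal sketch of real-mode (8086) segment:offset addressing, which is
# where the 640 KiB limit comes from: 16-byte-aligned segments give a 20-bit
# physical address, so 1 MiB total, and everything from 0xA0000 up was
# reserved for video memory, adapter ROMs, and the BIOS.

CONVENTIONAL_TOP = 0xA0000  # 640 KiB; segment 0xA000 is where video memory starts

def physical_address(segment: int, offset: int) -> int:
    """Translate a segment:offset pair into a 20-bit physical address."""
    return ((segment << 4) + offset) & 0xFFFFF  # the 8086 wraps at 1 MiB

def in_conventional_memory(segment: int, offset: int) -> bool:
    """True if the address falls inside the 640 KiB of conventional memory."""
    return physical_address(segment, offset) < CONVENTIONAL_TOP

# 0x9FFF:0x000F is the very last byte of conventional memory...
assert physical_address(0x9FFF, 0x000F) == 0x9FFFF
assert in_conventional_memory(0x9FFF, 0x000F)

# ...while 0xA000:0x0000 is already video memory, off-limits to ordinary programs.
assert not in_conventional_memory(0xA000, 0x0000)

# Many different segment:offset pairs alias the same physical byte.
assert physical_address(0x1234, 0x0010) == physical_address(0x1235, 0x0000)
```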

How Android engineers are winning the war on fragmentation

With the launch of Android 8.0 last year, Google released Project Treble into the world. Treble was one of Android's biggest engineering projects ever, modularizing the Android operating system away from the hardware and greatly reducing the amount of work needed to update a device. The goal here is nothing short of fixing Android's continual fragmentation problem, and now, six months later, it seems like the plan is actually working.

There are indeed some small signs of hope, but the reality is that as long as Samsung isn't on board, it's effectively all for naught. I find this article far too positive when you look at the reality of Android updates, but at least there's some progress.

The land before binary

The IRS has a lot of mainframes. And as millions of Americans recently found out, many of them are quite old. So as I wandered about, meeting different types of engineers and chatting about their day-to-day blockers, I started to learn much more about how these machines worked. It was a fascinating rabbit hole that exposed me to things like "decimal machines" and "2 out of 5 code". It revealed something to me that I had never considered:

Computers did not always use binary code.

Computers did not always use base 2. Computers did not always operate on just an on/off value. There were other things, different things that were tried and eventually abandoned. Some of them predating the advent of electronics itself.

Here's a little taste of some of those systems.

I've often wondered why computers are binary to begin with, but it was one of those stray questions that sometimes pops up in your head while you're waiting for your coffee or while driving, only to then rapidly disappear.

I have an answer now, but I don't really understand any of this.
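
For what it's worth, the "2 out of 5 code" mentioned above is one of the easier pieces to grasp: every decimal digit is stored as five bits of which exactly two are set, which gives exactly ten valid patterns and makes any single-bit error detectable. The little Python sketch below illustrates the idea; the digit-to-pattern assignment is arbitrary (real machines used specific bit weights), so take it as an illustration rather than any particular machine's code.

```python
# A rough sketch of the idea behind a "2 out of 5" code: every decimal digit
# is stored as five bits of which exactly two are set. There are exactly
# C(5, 2) = 10 such patterns, one per digit, and any single flipped bit breaks
# the "exactly two bits set" rule, so single-bit errors are always detectable.
# (Real machines assigned digits to patterns via specific bit weights; the
# assignment below is arbitrary and purely illustrative.)

from itertools import combinations

# All five-bit patterns with exactly two bits set, mapped to the digits 0-9.
PATTERNS = [frozenset(bits) for bits in combinations(range(5), 2)]
ENCODE = {digit: pattern for digit, pattern in enumerate(PATTERNS)}
DECODE = {pattern: digit for digit, pattern in ENCODE.items()}

def is_valid(pattern: frozenset) -> bool:
    """A code word is valid only if exactly two of its five bits are set."""
    return len(pattern) == 2

assert len(PATTERNS) == 10      # exactly enough patterns for the ten digits
assert DECODE[ENCODE[7]] == 7   # encoding and decoding round-trips

# Flipping any single bit of any code word yields an invalid pattern,
# so a 2-out-of-5 machine can always detect a single-bit error.
for digit, pattern in ENCODE.items():
    for bit in range(5):
        corrupted = pattern ^ {bit}  # toggle one bit
        assert not is_valid(corrupted)
```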

Haiku May monthly activity report

Haiku's latest monthly activity report is out, and it contains a lot of interesting points of progress. Since I can't highlight them all, here's one that I think is vital.

Korli continued his work on 32-bit applications support for x86_64. He now has most of the binary-loading, commpage, signals, and syscall system changes merged, though there are still a lot of pending changes to fix individual syscalls and then start applications in 32-bit mode.

There's also a major new port: LibreOffice has been ported to Haiku.

Apple and Google are heading in the same direction

But I think the reason that this year's WWDC felt a little Googley is that both companies are trying to articulate a vision of computing that mixes AI, mobile apps, and the desktop. They're clearly heading in the same general direction.

It's becoming ever harder to distinguish the two companies. They are clearly trying to work towards the same future, but they're coming at it from different directions. It's fascinating to watch.

Intel “forgot” to mention 28-core, 5 GHz demo was overclocked

Intel's recent demonstration of a 28-core processor running at 5GHz has certainly stirred the pot here at Computex, particularly because the presentation appeared to imply this would be a shipping chip with a 5.0GHz stock speed. Unfortunately, it turns out that Intel overclocked the 28-core processor to such an extreme that it required a one-horsepower industrial water chiller. That means it took an incredibly expensive (not to mention extreme) setup to pull off the demo. You definitely won't find this type of setup on a normal desktop PC.

We met with the company last night, and while Intel didn't provide many details, a company representative explained to us that "in the excitement of the moment," the company merely "forgot" to tell the crowd that it had overclocked the system. Intel also said it isn't targeting the gaming crowd with the new chip.

A lot of people always say "CEOs and companies don't lie because that's illegal, so you can always believe them".

Yeah.

ARM Holdings history: from Acorn to giant tree

The computer industry is full of noble failures. Big ones. Little ones. Ideas that were 10 years too early. Ideas that were 15 years too early. Ideas that were 30 years too early. And concepts that, while fundamental to the way that our computing culture works today, hadn’t yet reached their full potential. Though certainly successful in its early years, the ARM processor very much fits in the latter category. Today, variants of these processors are in just about everything, from tiny computers, to smartphones, to video game consoles, to television sets, and even some servers. But the company that initially forged the processor is almost forgotten at this point, seemingly lost to history (especially outside of Europe) despite being an early icon of British computing. Tonight's Tedium ponders the story of Acorn Computers, the long-departed company whose best idea is probably in the device you're using to read this.

This introduction is basically clickbait specifically designed for OSNews readers. Well done.

Ubisoft CEO: cloud will replace consoles after next generation

Better start saving up for that PlayStation 5, Xbox Two, or Nintendo Swatch (that last follow-up name idea is a freebie, by the way). That generation of consoles might be the last one ever, according to Ubisoft CEO Yves Guillemot. After that, he predicts cheap local boxes could provide easier access to ever-evolving high-end gaming streamed to the masses from cloud-based servers.

I think that's a little optimistic, but the trend is clear.

AI at Google: our principles

Sundar Pichai has outlined the rules the company will follow when it comes to the development and application of AI.

We recognize that such powerful technology raises equally powerful questions about its use. How AI is developed and used will have a significant impact on society for many years to come. As a leader in AI, we feel a deep responsibility to get this right. So today, we’re announcing seven principles to guide our work going forward. These are not theoretical concepts; they are concrete standards that will actively govern our research and product development and will impact our business decisions.

We acknowledge that this area is dynamic and evolving, and we will approach our work with humility, a commitment to internal and external engagement, and a willingness to adapt our approach as we learn over time.

It honestly blows my mind that we've already reached the point where we need to set rules for the development of artificial intelligence, and it blows my mind even more that we seem to have to rely on corporations self-regulating - which effectively means there are no rules at all. For now it feels like "artificial intelligence" isn't really intelligence in the sense of what humans and some other animals display, but once algorithms and computers start learning about more than just identifying dog pictures or mimicking human voice inflections, things might snowball a lot quicker than we expect.

AI is clearly way beyond my comfort zone, and I find it very difficult to properly ascertain the risks involved. For once, I'd like society and governments to be on top of a technological development instead of discovering after the fact that we let it all go horribly wrong.

Microsoft to possibly offer a “Switch to S Mode” option

Windows Insider Preview build 17686 includes a hint that Microsoft may soon allow users to "switch to S mode". If true, the software giant may finally reverse one of the worst design decisions in Windows history.

You can see this hint by opening the Settings app and typing S mode. As you can see in the shot above, Settings provides a search hint for a Settings interface called Switch to S Mode.

I would definitely use this switch; I pretty much run only Store Applications on my Surface Pro 4 anyway, and an easy switch to allow classic Win32 applications if the need arises seems useful.

How people used to download games from the radio

An anonymous user sent this one in, and even though it's old - 2014 - I hadn't read it yet, and I don't think it's ever been posted here.

It's a Monday night in Bristol in July 1983. Your parents are downstairs watching Coronation Street while you skulk in your bedroom under the pretence of doing homework. In reality, you're hunched over your cassette recorder, fingers hovering over the buttons in feverish anticipation. A quiver of excitement runs through you as a voice from the radio announces: "and now the moment you've all been waiting for..." There's a satisfying clunk as you press down on play and record simultaneously, and moments later the room is filled with strange metallic squawks and crackles. "SCREEEEEEEEEEE..."

You're listening to the Datarama show on Radio West and partaking in the UK's first attempt to send a computer program over local radio. Joe Tozer, who co-hosted the show, recalls how it all began: "I think it was just one of those 'ping!' moments when you realise that the home computer program is just audio on a cassette, so why not transmit it over air? It just seemed a cool idea."
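
To make the "a program is just audio" point a bit more concrete: home computers stored bits as bursts of tone, one frequency for a 0 and another for a 1 (frequency-shift keying), and anything that can carry that audio, whether a cassette or an FM broadcast, can carry the program. The Python sketch below writes such a signal to a WAV file; the frequencies, baud rate, framing, and file name are illustrative guesses, not the actual Datarama encoding.

```python
# A rough sketch of how a program becomes audio on tape (or radio): each bit
# is sent as a short burst of tone, with one frequency meaning 0 and another
# meaning 1 (frequency-shift keying). Real machines differed in frequencies,
# baud rate, and framing - the Kansas City standard used 1200 Hz and 2400 Hz -
# so the numbers here are only illustrative.

import math
import struct
import wave

SAMPLE_RATE = 44100          # samples per second in the output WAV file
BAUD = 300                   # bits per second (leisurely, very 1980s)
FREQ_0, FREQ_1 = 1200, 2400  # tone frequencies for a 0 bit and a 1 bit

def tone(frequency: float, duration: float) -> bytes:
    """Generate 16-bit mono PCM samples for a sine tone."""
    n_samples = int(SAMPLE_RATE * duration)
    samples = (
        int(32767 * 0.8 * math.sin(2 * math.pi * frequency * i / SAMPLE_RATE))
        for i in range(n_samples)
    )
    return b"".join(struct.pack("<h", s) for s in samples)

def encode(program: bytes) -> bytes:
    """Turn a byte string into FSK audio, least significant bit first."""
    bit_time = 1.0 / BAUD
    audio = bytearray()
    for byte in program:
        for bit_index in range(8):
            bit = (byte >> bit_index) & 1
            audio += tone(FREQ_1 if bit else FREQ_0, bit_time)
    return bytes(audio)

if __name__ == "__main__":
    # "Broadcast" a tiny program as a WAV file you could play over the air.
    with wave.open("datarama.wav", "wb") as wav:
        wav.setnchannels(1)       # mono
        wav.setsampwidth(2)       # 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(encode(b"10 PRINT \"HELLO BRISTOL\"\n"))
```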

I have very little experience with using cassettes as a data storage medium, except for that one time, somewhere in the late '80s or early '90s, when a neighbour kid and I loaded Rambo for the C64 from a cassette tape. That's the only time I ever did such a thing, and in hindsight, I'm glad I got to experience this era of computing, even if it was only once.

The future of the Mac comes from iOS apps

Apple made a big splash at WWDC this year when it announced that it would be letting developers port their iOS applications over to the Mac sometime next year - and that Apple had already started the process by bringing over the iOS versions of the Home, Stocks, News, and Voice Memo apps to macOS 10.14 Mojave.

The project - rumored to be codenamed Marzipan - is still in the early stages, and Apple isn't even planning on offering it to developers until 2019. And there's already a fair amount of confusion and outcry over what Apple's doing here: whether or not it will mean the death of the traditional Mac app as we know it, exactly how these new kinds of apps will work, whether they'll feel like traditional "native" Mac apps, and even whether or not it's fair to call these apps "ports". So here's what's actually going on.

A fair overview of "Marzipan" and what it could mean for the future of the Mac.

ReactOS GSoC: booting from Btrfs

ReactOS has unveiled its Google Summer of Code project, undertaken by Victor Perevertki.

My project is both simple and complicated. I want to add to ReactOS an option to install on and boot from BTRFS partitions. There are a few little things still needed to implement this:

  • BTRFS support in the bootloader.
  • Fixes in the cache controller and memory manager in order to boot with the WinBtrfs driver. It is getting better every week, but right now it is only used with the fastfat driver for FAT32.

My primary goal for this internship is to implement BTRFS support in FreeLdr - our bootloader.

Another great GSoC project to keep an eye on.

Microsoft announces Visual Studio 2019

In a blog post, Microsoft announced Visual Studio 2019.

Because the Developer Tools teams (especially .NET and Roslyn) do so much work in GitHub, you'll start to see check-ins that indicate that we're laying the foundation for Visual Studio 2019, and we're now in the early planning phase of Visual Studio 2019 and Visual Studio for Mac. We remain committed to making Visual Studio faster, more reliable, more productive for individuals and teams, easier to use, and easier to get started with. Expect more and better refactorings, better navigation, more capabilities in the debugger, faster solution load, and faster builds. But also expect us to continue to explore how connected capabilities like Live Share can enable developers to collaborate in real time from across the world and how we can make cloud scenarios like working with online source repositories more seamless. Expect us to push the boundaries of individual and team productivity with capabilities like IntelliCode, where Visual Studio can use Azure to train and deliver AI-powered assistance into the IDE.

Our goal with this next release is to make it a simple, easy upgrade for everyone - for example, Visual Studio 2019 previews will install side by side with Visual Studio 2017 and won't require a major operating system upgrade.

The company doesn't have a release date yet.

AMD reveals Threadripper 2: up to 32 cores

At the AMD press event at Computex, it was revealed that these new processors would have up to 32 cores in total, mirroring the 32-core versions of EPYC. On EPYC, those processors have four active dies, with eight active cores on each die (four for each CCX). On EPYC, however, there are eight memory channels, while AMD's X399 platform only has support for four channels. For the first generation this meant that each of the two active dies had two memory channels attached - in the second-generation Threadripper this is still the case: the two newly active parts of the chip do not have direct memory access.

I feel like the new focus for Intel and AMD is the battle for the highest core count at the lowest possible price, while still maintaining per-core clock speeds. My only hope is that this will spur better and easier parallelisation in software, so that we can all benefit from this battle.
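
For the curious, "parallelisation in software" at its simplest looks something like the toy Python sketch below: independent chunks of CPU-bound work handed to a pool of worker processes, one per core, so a 32-core chip really can chew through many chunks at once. The workload and numbers are made up purely for illustration.

```python
# A toy sketch of the kind of embarrassingly parallel workload that actually
# benefits from 16, 32, or more cores: independent chunks of work handed to a
# pool of worker processes, one per core. Nothing Threadripper-specific here -
# just the standard library's process pool.

import os
from concurrent.futures import ProcessPoolExecutor

def count_primes(limit: int) -> int:
    """Deliberately naive CPU-bound work: count primes below `limit`."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    chunks = [200_000] * 32  # 32 independent pieces of work
    # One worker per core: the more cores, the more chunks in flight at once.
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        results = list(pool.map(count_primes, chunks))
    print(f"{os.cpu_count()} cores, {sum(results)} primes counted")
```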

How do iOS Screen Time and Android Digital Wellbeing stack up?

Developer conference season is coming to an end with Apple's WWDC this week, and the main takeaway is that between Google's "Digital Wellbeing" and Apple's "Screen Time", the two biggest smartphone developers are taking some time to discourage smartphone overuse.

On the surface, the two companies are taking very similar approaches with the tools they're offering to present information to users. Apple and Google are both adding new dashboards, with options for more zoomed-out perspectives on how you're spending your time, along with more granular views of how often you're using individual apps - down to the minute. There's data on how many notifications you've received, where they're coming from, and breakdowns of when you're actually on your phone.

I like these features. I don't really need them - I don't even use my phone all that much - but I do like that they give me insight into how long I use certain applications, how often I pick up my phone, and so on. Neat data to have.

AirPods to get Live Listen feature in iOS 12

Apple has one hardware-specific feature planned that wasn't announced at Monday's WWDC keynote. In iOS 12, users will be able to use Live Listen, a special feature previously reserved for hearing aids certified through Apple's Made for iPhone hearing aid program, with their AirPods.

After enabling the feature in the iPhone's settings, users will be able to use their phones effectively as a directional mic. This means you can have AirPods in at a noisy restaurant with your iPhone on the table, for example, and the voice of whoever is speaking will be routed to your AirPods.

What a great accessibility feature for people with hearing problems.