Linked by Thom Holwerda on Mon 29th Apr 2013 07:08 UTC
Amiga & AROS "As computer games became more and more complex in the late 1980s, the days of the individual developer seemed to be waning. For a young teenager sitting alone in his room, the dream of creating the next great game by himself was getting out of reach. Yet out of this dilemma these same kids invented a unique method of self-expression, something that would end up enduring longer than Commodore itself. In fact, it still exists today. This was the demo scene."
Thread beginning with comment 560021
To read all comments associated with this story, please click here.
Comment by MOS6510
by MOS6510 on Mon 29th Apr 2013 07:47 UTC
Member since:

A lot of demos are pretty cool on their own, but when you consider how limited the hardware they run on is, they're just amazing.

Reply Score: 3

RE: Comment by MOS6510
by Kroc on Mon 29th Apr 2013 08:21 in reply to "Comment by MOS6510"
Kroc Member since:

And what's more, it isn't like a PowerPoint presentation that throws some pre-rendered assets together with a few effects. The most amazing part of demos, for me, is that they produce something so artful from very, very technical code. If a demo were just a "slideshow", artistic style would be a given, since the time would be spent on the art rather than the code. But a demo is almost all code, so it takes a herculean effort to bash the metal that closely and still produce something artistic.

Reply Parent Score: 4

RE[2]: Comment by MOS6510
by Kochise on Mon 29th Apr 2013 08:47 in reply to "RE: Comment by MOS6510"
Kochise Member since:

Yup, read this: the French demoscene on the Atari ST, which had fewer hardware features than the Amiga and thus demanded more software work :


Reply Parent Score: 2

RE: Comment by MOS6510
by Doc Pain on Mon 29th Apr 2013 09:23 in reply to "Comment by MOS6510"
Doc Pain Member since:

A lot of demos are pretty cool on their own, but when you consider how limited the hardware they run on is, they're just amazing.

But also consider that those who programmed those "limited" systems (think of the C64, the Amiga 500 and its successors, the various Atari ST and pre-ST computers) could use simple tools. Tools that we cannot use anymore.

Simple task: put a pixel on the screen

Simple solution: write some data to a specific memory location, often in assembler (faster is better)

Also simple, but often less efficient: use a predefined command in the supplied programming language, like putpixel(xpos, ypos, color), in a higher-level language that developers today would already consider the lowest level (as if assembler did not exist)

Today's common solution: I cannot even describe it, because depending on the system, you will have to deal with a heap of abstraction layers, libraries and other conglomerates of code that add complexity and remove efficiency (or even the ability to do something more efficient). It's not easy anymore.
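The two "simple" approaches above amount to the same thing in the end. A minimal sketch in C, using an ordinary byte array to stand in for video memory (the 320x200, one-byte-per-pixel layout is illustrative, not any particular machine's):

```c
#include <stdint.h>

#define SCREEN_W 320
#define SCREEN_H 200

/* A flat byte array standing in for an 8-bit-per-pixel framebuffer.
 * On real hardware this would be a fixed address in video RAM rather
 * than an ordinary array, but the arithmetic is the same. */
static uint8_t framebuffer[SCREEN_W * SCREEN_H];

/* The "simple solution": compute the offset and write one byte to one
 * memory location. A library putpixel() ultimately compiles down to
 * exactly this store. */
static void putpixel(int x, int y, uint8_t color)
{
    framebuffer[y * SCREEN_W + x] = color;
}
```

The "predefined command" and the direct memory write differ only in who does the offset arithmetic; on the old machines both were a handful of instructions away from the screen.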

Sometimes, when I need to program something simple involving simple hardware, I find it becoming complex and complicated. Having done the same thing 20 years ago, I don't feel very efficient jumping through today's hoops. For example, switching some relays via the parallel port used to mean just writing specific data to that port; inport() and outport() were even supplied by the system's library, relying on BIOS calls. Today? There isn't even a parallel port! USB, converters, microcontrollers, firmware, connectors, adaptors... it's not fun anymore, and definitely not simple. :-(
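The relay example can be sketched in a few lines of C in the old DOS style. 0x378 is the classic LPT1 data register address; the mapping of relays to data bits is an assumption for illustration, and outportb() is the DOS-era call (Turbo C and friends), shown as a comment since it only exists on such compilers:

```c
#include <stdint.h>

#define LPT1_DATA 0x378  /* classic PC parallel port data register */

/* Build the data byte for a bank of up to 8 relays: bit n set means
 * relay n energized. The relay-to-bit mapping here is our assumption,
 * not a standard. */
static uint8_t relay_mask(const int on[], int count)
{
    uint8_t mask = 0;
    for (int i = 0; i < count && i < 8; i++)
        if (on[i])
            mask |= (uint8_t)(1u << i);
    return mask;
}

/* On a DOS-era compiler, this single call would flip the relays:
 *
 *     outportb(LPT1_DATA, relay_mask(states, 8));
 *
 * No drivers, no protocol stack: one byte to one I/O address. */
```

That one-line outportb() is the entire "firmware" of the 20-years-ago solution, which is exactly the simplicity being mourned above.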

That's why I'm still fascinated by today's demo scene, primarily by the fact that it still exists. Having been a member of that scene myself in the past, it's nice to see that something the "experts" consider dead is still alive and kicking. The mentality of "we can do it better" (not in the sense of "better than you", but "more efficiently", "faster", "in fewer bytes" or "with higher-performing code") is also worth keeping alive, especially next to today's obsession with layers of abstraction, libraries and frameworks stacked on top of each other, with no real understanding of what's happening underneath. Of course, working on bare metal requires much more knowledge and experience than clicking around in some pre-chewed environment whose frameworks "take care of everything" and, in the end, produce unmaintainable code, slow applications, errors, crashes, wrong results and bloat.

I still have my Amiga collection, as well as some Atari computers. Some day, I hope, I will use them for something interesting. Of course they still work perfectly; they don't rot as quickly as today's "modern" computers. :-)

Reply Parent Score: 5

RE[2]: Comment by MOS6510
by bassbeast on Mon 29th Apr 2013 11:56 in reply to "RE: Comment by MOS6510"
bassbeast Member since:

The difference is that back then you could ask for, and get, the full blueprints and opcodes for the chips. Moreover, it wasn't that hard to build a mental map of what a chip was doing, because the designs were MUCH simpler. Heck, I remember reading that the guys at Commodore used to build working mockups of their chips on a big breadboard with a LOT of point-to-point wiring.

Now compare that with the chip I'm typing on, which isn't even state of the art yet has 6 cores and 3 levels of cache. If you look at a diagram of a modern chip layout, it's simply too complex for the simple solutions anymore. Heck, even ARM, which used to be all about simplicity, is up to 6 cores and 64 bits. So while it's still possible to use ASM today, the odds that you can cook up something better than the compiler are pretty slim unless you're a superbrain.

Reply Parent Score: 4