What is this all about?

Advances over the past decade in many aspects of digital technology
have loosened programmers' belts. We don't optimize as we used to.
Hell, I usually get the impression we don't do it at all any more.
Why should we care anyway, if:
- These days it's just cheaper to buy a faster board than to optimize
the code and design.
- It has become an enormous nightmare to optimize for modern superscalar
and VLIW designs. Currently everybody leaves the bloody job to the
compiler.
- In the software industry, where customer pressure is high, where
minimal time-to-market, high production costs and other economic
buzzwords take over control, optimization is a no-man's land. Tracking
trends in USENET discussions leads to the conclusion that DSP programmers
are the only professionals left on planet Earth who really bother
to optimize. One could wonder how long even that will last.
- The thing that surely doesn't help is software patents. Although they were invented
to help society improve and innovate, paradoxically they work in the
opposite direction. Some even claim it has become nearly impossible to use sophisticated
methods without infringing on a patent. (And although this issue is as
serious as it gets, it is totally ignored by most scene coders.)
The current situation isn't at all bad for day-to-day programming,
where productivity is all that matters. So why the heck should we
bother with optimizing at all? Because that's what democoding is
all about. Let's clear things up: we code demos to:
- Show off the programmer's skills. How do you compare PC coders' skills
at writing efficient code when the hardware landscape is so broad and progress
iterates at a mad rate?
- Demonstrate the machine's capabilities. Even beyond the scope of
the hardware designer's imagination! That was always the magic
aspect which gave me an itch that needed scratching. "Push the wall
of impossibilities further", as is written in Roots' endscroll. I
don't think everything those old machines are capable of
has been shown yet. Can it ever be, anyway?
There is only one true way to achieve these objectives: target
a fixed platform. History shows that a machine's scene is in bloom
when the machine is a common platform.
- C64. For lack of options, they chose the C64 with a 1541
as the reference platform, and it has kept them happy (and still does).
The only noticeable fragmentation was caused by varying SID revisions.
The result? The code of C64 demos is pure wizardry, often
tweaked down to the single CPU cycle!
- Amiga. When the A500 with a memory expansion to 1 MB landed on sceners'
desks, the scene was flooded with masses of demos. Advanced algorithms
were developed, sophisticated methods invented. Then came the AGA machines,
and shortly after, hardware (mainly CPU) add-ons. Fragmentation began.
Compared to the PC's it was nearly non-existent, though, so it helped
keep the Amiga in the winner's position for quite a lengthy period
of time. These days some guys even want to force everyone to program
demos for gfx cards. No wonder the Amiga scene is in a state of agony.
- Consoles. The poor expandability of consoles would make them perfect
hardware for scene coding, but the ultra-high cost of quality development
tools stopped that from happening. This has been changing a bit lately,
as some third-party semi-free tools have appeared. However, the lack of
developer's attributes (a keyboard and writable mass storage media)
as standard equipment is still a barrier preventing them from reaching
critical mass.
As we learn from the past, fixed platforms bring out the essence
of democoding. We can get back there only by writing for the hardware
which is the lowest common denominator in a given niche, which simply
means the A500 with 1 MB of RAM. I think that by rejecting the culture
of instant upgrades in the scene domain, we can push the old spirit
back into it. We are in the comfortable position of not having to deal
with such costs (read: our time is there to be wasted). After all, we're
doing everything here just for the fun of it, aren't we? I hope this
document will help push the style of Amiga scene development back onto
the old track.