From Assembly Intelligence to Artificial Intelligence
My world view as a teenager was binary in at least 10 intertwined ways: all the code, obviously, and the fact that you were either elite or a “lamer”. I’d hazard a guess that this view was true for practically every demoscener at the time (mid-80s through early 90s). We conditioned each other and ourselves to wrap up all of our worth in one simple metric: how much impossible shit can you squeeze out of the machine?
The demoscene was about moving the most graphics around on the screen, in the craziest ways possible, while maintaining full framerate (50 or 60 frames per second depending on which continent you were on). It was about playing back samples (recorded audio) using sound chips that didn’t have that capability. It was about rendering graphics in full-screen, which couldn’t be done according to the hardware manufacturers. It was about playing three-note chords on a three-channel sound chip while simultaneously playing drums, bass, and a melody. It was about learning, and constantly re-learning, the many answers to the eternal question of whether to take the hit on the CPU or the RAM.
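That last trade-off, CPU versus RAM, can be sketched with a classic demo trick: the precomputed sine table used to drive wobbling scrollers and plasma effects. This is an illustrative Python sketch, not scene code (the real thing was hand-written assembly on machines with a few kilobytes to spare):

```python
import math

# The eternal question: burn CPU cycles recomputing a value every frame,
# or burn scarce RAM storing it precomputed?

TABLE_SIZE = 256  # the RAM hit: 256 bytes, one entry per 8-bit angle

# Pay the RAM once at startup...
SINE_TABLE = [
    int(127 * math.sin(2 * math.pi * i / TABLE_SIZE)) for i in range(TABLE_SIZE)
]

def sine_fast(angle):
    # ...then every lookup is a single indexed read — cheap enough to
    # call hundreds of times per 50 Hz frame, even on a 1 MHz CPU.
    return SINE_TABLE[angle & 0xFF]

def sine_slow(angle):
    # Pay the CPU instead: no table, but a full trig computation per call,
    # hopeless in a tight raster loop on an 8-bit machine.
    return int(127 * math.sin(2 * math.pi * (angle & 0xFF) / TABLE_SIZE))
```

Both return the same values; the whole art was knowing which price the effect could afford to pay.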
Whoever did the latest thing best, biggest, and fastest was the most elite… at least for a few months, until their record was broken by the next guy (there was an extreme lack of female representation).
Some people in the scene were the opposite. Lamers. It’s not a word I’d use to describe a human being today, but in the context of ancient times, it was someone who had a less than firm grasp of what they were doing, code-wise. Some were just hang-arounds. Some would talk big without being able to sling assembly. But then there was the worst kind: those who stole code from elites and passed it off as their own. When that type of thing was revealed, the scene would tear the perpetrator apart in the social media of the time: scrolling texts in demos, in which you could either be “greeted” (good) or called out as a lamer for all to see (bad).
The unwritten but very real rule was: know the machine and write the code yourself. Talk directly to the hardware. Count the clock cycles and optimize. If you’re using a library, it better be your own. Write ingenious code or you’re out. Oh, and don’t tell anyone outside your crew how you did it.
Great! So, we had under-documented hardware, zero help on the software side, nobody willing to share, and an understanding that we were useless unless we knew how to consistently force computers to do things they were never meant to do.
No way to learn, and an impossible mission.
Except… disassemblers existed. A disassembler is a piece of software that can turn machine code (what the computer’s processor understands) back into assembly code (what very, very sick people understand). As a mental shortcut, you can think of it as doing a View Source in your browser.
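The core of the idea fits in a few lines: map raw opcode bytes back to human-readable mnemonics. Here's a toy sketch covering three real 6502 instructions (a full disassembler handles the entire instruction set and all addressing modes):

```python
# Each opcode byte maps to a mnemonic and the length of its operand.
OPCODES = {
    0xA9: ("LDA", 1),  # load accumulator, immediate (1 operand byte)
    0x8D: ("STA", 2),  # store accumulator, absolute (2-byte address)
    0x60: ("RTS", 0),  # return from subroutine, no operand
}

def disassemble(machine_code):
    lines, i = [], 0
    while i < len(machine_code):
        mnemonic, operand_len = OPCODES[machine_code[i]]
        operand = machine_code[i + 1 : i + 1 + operand_len]
        if operand_len == 2:
            # 6502 stores multi-byte operands little-endian: low byte first.
            lines.append(f"{mnemonic} ${operand[1]:02X}{operand[0]:02X}")
        elif operand_len == 1:
            lines.append(f"{mnemonic} #${operand[0]:02X}")
        else:
            lines.append(mnemonic)
        i += 1 + operand_len
    return lines

# LDA #$01 / STA $D020 / RTS — set the C64 border colour, then return.
print(disassemble(bytes([0xA9, 0x01, 0x8D, 0x20, 0xD0, 0x60])))
```

Point it at someone's shipped demo instead of six bytes, and you have the scene's View Source.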
The second unwritten rule was: everyone looks. Each shipped demo was an invitation to figure out how it had been done, and the answers, by necessity, always shipped with the program itself. Load it up in the disassembler and work your brain until it melts. Train on what already exists. Learn from those who'd routinely lock away their knowledge to protect their status. Hack all the things.
And this is how the scene evolved: by building on what came before. As with any tech curve, there were a few revolutions, but fewer than you might expect. Most of it was incremental. One more sprite on the screen. One more border removed (on the way to full-screen). A few more audio channels. But, crucially, you had to figure it out yourself and then figure out how to make it even better. It’s how credibility was earned alongside the opportunity to join more elite crews. Status and prestige rooted in nothing but knowledge.
The scene, though a tribal and semi-toxic nerd jungle, was the best teacher I ever had. It laid an exceptionally solid computer science foundation for a lot of people, many of whom launched some of our most significant gaming and software companies over the years.
Its logic was also completely broken. Taken to its end, we shouldn’t have been allowed to write demos in editors we didn’t make, in operating systems we didn’t write. Or on hardware we didn’t design and build ourselves. Also, best be generating your own electricity from your own power source. And don’t forget to create your own universe first, or you’re a lamer.
I carried this baggage a number of years into my engineering career. Write your own game engine or it’s not your game. Don’t use a library to draw your graphics; do it yourself. Write it in as low-level a language as possible. There are still certain strengths in this thinking from an academic angle, and, of course, it can be lots of fun rolling your own. It’s certainly educational.
But a company is not going to give you extra points for doing it the hard way.
As a coder and musician, I have lots of mixed feelings on where AI is now and where it's likely to go. It's amazing tech, and it will have amazing consequences, ranging from positive to negative across multiple dimensions. I'm not going to try to predict what the world will look like once AI is allowed full access to its own source code and deployment pipeline, and can iterate on itself autonomously. That day will come, because someone will allow it. But right now, as an engineer, you cannot afford to ignore or even under-utilize AI.
As someone who still, after all these decades, gets a rush the first time my new program runs the way I intended, this is painful to say: it’s not so much about the code anymore, and soon, it won’t be about the code at all in 99% of dev scenarios. Shaping code will largely die. Most of us will need to transition to shaping outcomes: orchestrating, curating, and building trust and safeties. I don’t buy the notion that AI will just “change” engineering jobs. Some will vanish. Others may emerge.
For now, just like we’ve built abstraction upon abstraction to make things easier and faster throughout computing history, this is another layer. It’s your infinite library for your favorite programming language. It’s your pair programmer and rubber duck. It’s contemporary coding, and you have to adapt.
AI is not, however, just another tool, like some are saying. It’s also an agent, and that’s vastly different from what came before. The pace of foundation model development isn’t just exponential; it’s accelerating exponentially. It will auto-evolve soon enough and that’ll change things again. Until then, dig into it with all the curiosity of a kid with a disassembler in 1986.
I’m certain of one thing: we’re still trying to make impossible things happen with imperfect tools. We’re still in the game.