
Dialectics, baby!

Coral Engels Explaining Dialectics. Another excellent photoshop. Who needs Midjourney, huh!?
📼
[Quick note: Change of plan for the video of new song The Matte Flex, which I previously said would be arriving on Friday. Data Airlines are submitting it in competition at Revision 2024, the world's biggest demoscene event, on Saturday, and the rules say no entry can have been released previously. The demoscene on the Amiga computer was a big influence on my formative music years and, more recently, on my formative pixel-smashing years. And as you'll eventually see, the video for The Matte Flex continues that tradition. Data Airlines will be publishing the video on their YouTube channel some time on Saturday evening, once it has had its debut at Revision. Subscribe to the Data Airlines YT to get a heads up, or hang tight and it'll be with you via another dispatch from The K.N.R.U. over the weekend. Or perhaps on Monday, because weekends are for ADVENTURE. And by ADVENTURE what I really mean is SLEEPING].

THE CORAL GAMES MUSIC SYSTEM (Part One)

A few weeks ago I posted an endless version of Coral Games from Telex From MIDI City. At the time I promised a deeper dive into how the endless system was made. So here we go...

GODOT

I used an open source game engine called Godot to build this system. This was a deliberate experiment to see how far I could get using software that isn't built and maintained by huge corporate entities.

Late last year in particular, the (now ex-)CEO of Unity (one of the big tech companies in game engine software) tried to push through some hilariously evil new terms that would have ruined a huge swath of indie developers who rely on Unity to build their games. It backfired: there was a huge revolt, as well as an exodus of developers who understood that even if this particular mess got rolled back, the writing was clearly on the wall in the long term.

This resonated over at 65LABS because Wreckage Systems is built using Unity and Wwise (Wwise is what is known as audio 'middleware' that integrates with Unity and provides more dedicated, specialised software for sound design; it is also a proprietary system). And when I have done any freelance work on video games over the last few years that involves programming/audio implementation/UI design and not only composition, that has also been done inside Unity/Wwise.

The sensible move for me, if I were a responsible adult human thinking about how to improve my chances of freelance work, would be to switch to Unreal Engine, the other big player in this space. And I have been experimenting with 'MetaSounds', the Unreal audio environment, which has got a lot of potential that I'll no doubt be exploring in due course. But Unreal is owned by Epic Games, yet another entirely untrustworthy tech company, so it would be a frying pan -> fryer kind of sidestep. Godot is infinitely more charming and, crucially, open source. It's free to use, I get to keep whatever I make, and not being beholden to the kind of idiots who are always in charge of, well, everything is a nice feeling. And so Godot it was. And although a Wwise integration for Godot does exist and works really well, I deliberately didn't use it for this, to see how far I could get using the vanilla Godot framework.

HOW FAR COULD I GET?

Well, it's nowhere near as good as the kind of systems I can build using more capable software that I am more familiar with. But it's not bad! And it's open source! (I realise only now that I threw together this project without using source control, so there is no code repository to easily link to. If anybody is curious - make yourself known and I'll get it uploaded somewhere.)
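To give a flavour of what 'vanilla Godot' means here, below is a minimal sketch of the kind of endless layer you can build with nothing but a stock AudioStreamPlayer and GDScript. To be clear, this is not the actual Coral Games Endless code (which, as mentioned, isn't sitting in a repository anywhere yet) - the stem pool, the gap times and the names are placeholders, purely to show the shape of the idea.

```gdscript
# endless_layer.gd -- an illustrative sketch of one endless layer in vanilla Godot 4.
# Not the actual Coral Games Endless code: stems, gap range and node setup
# are placeholders.
extends AudioStreamPlayer

# A pool of short stems for this layer, assigned in the Inspector.
@export var stems: Array[AudioStream] = []
# How long to rest between stems, in seconds.
@export var min_gap := 0.5
@export var max_gap := 4.0

func _ready() -> void:
    if stems.is_empty():
        return
    randomize()
    _play_forever()

func _play_forever() -> void:
    while true:
        # The pool is what keeps the layer sounding like itself;
        # the ordering and the spacing are what keep it liquid.
        stream = stems.pick_random()
        play()
        await finished
        # Rest for a random stretch before the next stem.
        await get_tree().create_timer(randf_range(min_gap, max_gap)).timeout
```

Each layer just picks something from its own little pool, plays it, rests for a while and goes again. Stack a few of those on top of each other and you already have something that always sounds like itself without ever playing out exactly the same way.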

I have spent a lot of time trying to coax more complicated systems into making a kind of 'everything' music over the years. It probably took longer than it should have for me to realise this is a flawed and impossible idea. These days I favour much smaller music systems that will always sound like themselves, but remain in a liquid form. Coral Games Endless is a simple example of this. Look, here's me finding an excuse to quote Engels in what was nominally supposed to be a thesis about writing popular music, making this idea sound far more complicated than it needs to be:

The third addendum introduces the importance of the dialectical method. As Friedrich Engels said, "For [dialectical philosophy], nothing is final, absolute, sacred. It reveals the transitory character of everything and in everything; nothing can endure before it except the uninterrupted process of becoming and of passing away, of endless ascendancy from the lower to the higher." (Engels, 1990). Decomposition Theory approaches music making in this way. By thinking about the act of composition as a multiphase process, in which a musical system is first composed and then works are 'decomposed' from it, the composer can lean into music-as-process while still producing musical objects that fit into the expectations of popular music culture. This study has shown how Decomposition Theory can work in terms of balancing algorithmic and manual composition. It has also demonstrated how to find a balance between academic, practice-based research approaches and instinctive, spontaneous artistic decisions.

These days I would more concisely say: Dialectics, baby! It's the only game in town. All of us are processes, even once we die.

Snarking at my old academic writing aside, I still broadly believe this is an idea worth exploring. In fact, here is one last quote from my thesis which, yes, does read a little, er, dry, but I feel gets to the heart of what I am chasing:

In the context of being a composer of popular music, algorithmic music strategies can be employed by looking at composition as a multiphase process through which music can be composed not just spectrally and temporally, but in terms of potentialities or multiplicities. By conceiving the act of composition as a number of phases or iterations, it allows us to add, shape and remove algorithms at different points where they can open up or close down musical possibilities. The composer can work with or against them in a dialectical relationship, giving them as many or as few responsibilities as they choose.
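If that reads as abstract, here is roughly what 'as many or as few responsibilities as they choose' can look like in code. Again, this is a purely illustrative sketch rather than anything from the actual project: the section names, the pool and the single responsibility dial are all made up for the example.

```gdscript
# sequence_brain.gd -- an illustrative sketch, not code from Coral Games Endless.
# One knob decides how much of the arrangement is hand-composed and how much
# is handed over to the algorithm.
extends Node

# The arrangement exactly as composed, by hand, in order.
var composed_sequence: Array[String] = ["intro", "verse", "chorus", "verse", "chorus", "outro"]
# Everything the algorithm is allowed to reach for when it takes a turn.
var section_pool: Array[String] = ["verse", "chorus", "bridge", "breakdown"]
# 0.0 = play the piece exactly as written; 1.0 = hand every decision to the algorithm.
@export_range(0.0, 1.0) var algorithmic_responsibility := 0.3

func next_section(step: int) -> String:
    # At each step, decide who is in charge: the composer or the algorithm.
    if randf() < algorithmic_responsibility:
        var algorithmic_choice: String = section_pool.pick_random()
        return algorithmic_choice
    return composed_sequence[step % composed_sequence.size()]
```

Turn the dial to zero and you have a fixed, linear composition; turn it up and the algorithm gradually takes over more of the decisions. That is the dialectical relationship the quote above is gesturing at, in a handful of lines.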

Hopefully this is clear but, to be very clear, this is the context in which I think there can be room for algorithmic or generative systems as part of a creative practice, which I consider to be entirely distinct from what is currently being touted as 'A.I.'. Here the artist remains involved and ultimately retains control of the output, and the intent is entirely human. I feel I am going to need to keep restating this distinction between generative art and LLM-based trash until this whole A.I. hype train finally crashes.

What this all boils down to (much like Wreckage Systems, which was/is the logical conclusion of all of this thinking) is that while I continue to value fixed song-shaped songs and linearly-composed music, I think there is space to explore music systems that are always recognisable as themselves, but never quite manifest the same way twice.

And this finally brings us back to Coral Games Endless. Or does it? I really didn't expect this post to be so long. And so once again, I'll leave this here for now. Part Two to follow some day, in which I promise to actually get into the nuts and bolts of what I built, just in case you wanna try it too!