'Forgotten Places' began as a feeling, a chord progression, and an idea: an audio-visual project about the lost places in our built environment and the beauty of nature starting to take hold again. For me these sorts of places have always been important as a way to step outside the city grind when there's no time to properly escape the city. There's something special about unkempt spaces and the form of complexity that emerges in them, along with the reassuring feeling that nature will persist beyond us.
I chatted to Kathrin about the ideas and she made an on-the-spot improvisation for the vocals in the way that only she can, full of expression. I then started looking for imagery to match the idea and found Jonk's amazing photographs of abandoned spaces. He was interested in experimenting with how we could use his work as the basis for a music video, so the next challenge was how to turn stills into moving image.
The next stage of the project happened with AI art specialist Xander Steenbrugge. In an era when there are often issues around the ownership and crediting of AI-generated art, I was interested in how we could base a project on a single artist, Jonk, as the source. Xander created predicted frames linked to the photographs, so sometimes we see the spaces before they were abandoned, or interpolations between them.
Finally, I started chatting to Ukrainian artist Nick Motion about how we could take all of the stills and the downstream generated video and combine them with his style of particle work and editing to yield the final audio-synced sequence. So overall it's an experiment combining different human and machine approaches, hopefully celebrating the photography of Jonk and capturing some of the feeling and merit of our forgotten places.
Every project is a journey. And Forgotten Places is such a special one.
For me it was a fun deep dive into new software, coding and mathematical concepts. I tried to unleash forces of randomness to create a memorable visual narrative. My main challenge was to create an empty container, visually interesting enough for viewers to interact with: almost a simulation of a narrative.
I believe in the ultimate power of abstraction and chaos, because it enables the viewer to participate in the act of art. You can only see things you already know, feel, and have experienced. If you listen carefully, you'll find that the whole world is a reflection of yourself, and it can't be any different.
This was my first experience working closely with AI. Thanks to the developer community, I have access to useful AI models without leaving my native work environment. It is amazing how far we can get in the simulation of reality, and it gives us the opportunity to discover interesting parallels between nature and man-made structures, both physical and abstract.
Max Cooper - Exotic Contents
I have been exploring the difficulties of communicating with words as part of my new Unspoken Words album project. I was thinking about how to visualise the idea, and started chatting to artist and machine learning specialist Xander Steenbrugge about a system for converting words to visual stories. The idea was to take the writings of Ludwig Wittgenstein, who tackled this question of the difficulties of using words to explain our selves and our place in the world, and to have the AI system re-interpret these writings in visual form, making a direct visual representation of the Unspoken Words album concept.
It's interesting for me to see this incomprehensible philosophical language interpreted visually, full of symbolism, with the boundaries between language, our selves, and the world broken down into flowing abstraction. I haven't really taken it all in yet; I feel like there's more to discover in it than I can appreciate. The system has a lot more potential too: we will be running a version where you can have your own words turned into visuals and music as well, so keep an eye on my socials and mailers if you'd like to get involved.
The music was a similar departure from my usual interpretation of ideas, where I applied a half time drum and bass format with more aggression and sharpness of sound design. It’s a club track, but for me, an exotic form of club track. The name came from Wittgenstein’s private language argument, and the difficulty of communicating an entirely internal, subjective object to others when we can never directly show the object to each other. He uses the analogy of a beetle in a box, and I refer to the idea instead as our exotic contents.
From the very start of this collaboration, Max and I were eager to leverage the latest advances in AI to directly visualise language through the mind of the machine. For the past few months I had been exploring this approach using a combination of two machine learning models: a generative system called VQGAN, which can generate pixels starting from a blank canvas, and a perceptual system called CLIP, which guides this generator by scoring how well the generated imagery matches a given language prompt. Through this feedback loop, the system can be used to directly convert language into the visual domain.
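The shape of that generator/scorer loop can be sketched in miniature. The snippet below is a hypothetical toy stand-in, not the real VQGAN or CLIP: a fixed random linear decoder plays the generator, cosine similarity against a stand-in "prompt embedding" plays the perceptual scorer, and a finite-difference hill climb replaces backpropagation. Only the structure of the feedback loop is the point.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins (hypothetical, not the real models):
# a fixed decoder maps a 16-d latent to a 64-d "image",
# and the score is similarity to a stand-in prompt embedding.
W = rng.normal(size=(64, 16)) * 0.25   # decoder weights, scaled to avoid tanh saturation
target = rng.normal(size=64)           # stand-in for CLIP's text embedding

def generate(z):
    """Decode a latent vector into a flattened 'image'."""
    return np.tanh(W @ z)

def score(image):
    """Cosine similarity between the image and the prompt embedding."""
    return image @ target / (np.linalg.norm(image) * np.linalg.norm(target))

def optimize(z, steps=200, lr=0.5, eps=1e-4):
    """Hill-climb the latent so the generated image scores higher.

    A central finite-difference gradient stands in for backprop."""
    z = z.copy()
    for _ in range(steps):
        grad = np.zeros_like(z)
        for i in range(len(z)):
            dz = np.zeros_like(z)
            dz[i] = eps
            grad[i] = (score(generate(z + dz)) - score(generate(z - dz))) / (2 * eps)
        z += lr * grad                 # ascend the score
    return z

z0 = rng.normal(size=16)               # random starting latent
z_opt = optimize(z0)
print(f"start: {score(generate(z0)):.3f}  optimised: {score(generate(z_opt)):.3f}")
```

The real pipeline instead optimises VQGAN's latent codes with gradients backpropagated through CLIP over thousands of steps, but the loop is the same: generate, score against the prompt, nudge the latent, repeat.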
After an early phase of experimentation with this technique, we quickly settled on Wittgenstein's Tractatus as a great narrative source, since it deals directly with the relationship between language and reality and aims to define the limits of science.
The final video, then, is a direct visualisation of many of the statements posed by Wittgenstein in his seminal work, as interpreted by the AI.