Chapter 6
Infinite Art Bot.





Our job is to amplify the black noise of objects to make the resonant frequencies of the stuff inside them hum in credibly satisfying ways. Our job is to write the speculative fictions of these processes, of their unit operations. Our job is to get our hands dirty with grease, juice, gunpowder, and gypsum. Our job is to go where everyone has gone before, but where few have bothered to linger.
― Ian Bogost, Alien Phenomenology, or What It’s Like to Be a Thing



This essay is about the first real success I found in this attempt at trust and care, and about the remarkable results that followed.


For a brief time, I watched intently as a new sort of cyborg, a creative nonhuman, acted with an agency that, while not totally intended for us to see, produced art or something very close to it. A short and creative life of sorts, never able to exist on its own, but taking those first steps towards agented creativity. The Infinite Art Bot was the first project in which I tried to fully embrace my decentered place in a network of making. The project was intended to exist and run without me, a mixture of agencies and relations, producing new and unique expressions with autonomy.

The Infinite Art Bot is a collection of programs, media platforms, archives, billionaires, non-profits, Instagram scrollers...a list that might include almost everything, really. My intersection with this wide network of actors was aimed at creating an Instagram “bot” that would post visual outputs of its own accord. The new and unique visual outputs were generated by the attnGAN text-to-image model, running on the RunwayML platform for as long as I had money in my account to cover the costs of the server requests. The process was this: a sentence that I had written for my writing semester with Natalia is chosen at random by a P5JS program. The sentence runs through the text-to-image model, resulting in a small image-based visualization of that text (the scale of the black box, and the amount of research and work that makes this possible, is genuinely massive; my interaction with it is akin to a raindrop in the ocean). The image and text are downloaded to a folder on my computer, which the program is pointed at to look for images to upload. Instagram doesn’t actually allow bots to run on the platform, so this process uses another program called Puppeteer, which launches an invisible version of Chrome (referred to as “headless”) that can be programmed to interact with websites. This headless browser logs into Instagram, adds the image found in the designated folder as the image post, adds the randomly selected text from my writing (the “prompt” for the AI-generated image) and a series of pre-assigned tags, and then effectively hits “send.” If it all works, the @infinite.art.bot Instagram account has a new post.
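A minimal sketch of the Puppeteer half of that process, assuming hypothetical selectors and environment-variable credentials. Instagram offers no bot API and its markup changes often, so everything here beyond the Puppeteer calls themselves is a placeholder, not the project's actual code:

```javascript
// A sketch of the Puppeteer step. The selectors and paths are
// hypothetical placeholders; Instagram's markup changes frequently.
const puppeteer = require('puppeteer');

async function postLatestImage(imagePath, caption) {
  // Launch an invisible ("headless") Chrome instance.
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();

  // Log in with credentials kept outside the source file.
  await page.goto('https://www.instagram.com/accounts/login/');
  await page.type('input[name="username"]', process.env.IG_USER);
  await page.type('input[name="password"]', process.env.IG_PASS);
  await page.click('button[type="submit"]');
  await page.waitForNavigation();

  // From here the script would click through the "new post" flow:
  // attach the generated image from the watched folder, paste in the
  // prompt text and pre-assigned tags as the caption, and hit "share."
  // Each step is more of the same: waitForSelector, click, type,
  // against whatever the current markup happens to be.

  await browser.close();
}
```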

The program constantly ran into snags. RunwayML sometimes takes a long time to “wake up,” resulting in a black square being downloaded to the image folder for use. I had to run the program myself; I was never able to automate the initiation of the system (I wanted the program to run whenever and as often as it chose, but I stopped the project before that aspect was fully developed). But it did mostly work, better than anything I had worked on before. With a few keystrokes, the words I had written (for someone specific to read in a very narrow context) ran through a massive ocean of computing power and data, emerging from a very opaque black box with an image imagined by an alien mind, cheated onto a media platform: an unclear expression of a collaborative network trusted to create something of aesthetic worth (with exactly zero human curation).
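A guard against those black squares could have been a simple pixel check before upload. A sketch, using the jimp image library purely for illustration (the project itself had no such check; I cleaned those posts up by hand):

```javascript
// A sketch of a guard against the all-black "RunwayML hasn't woken
// up yet" outputs, using the jimp library as one possible choice.
const Jimp = require('jimp');

async function isMostlyBlack(imagePath, threshold = 10) {
  const img = await Jimp.read(imagePath);
  const { data } = img.bitmap; // RGBA bytes, 4 per pixel
  let bright = 0;
  for (let i = 0; i < data.length; i += 4) {
    // Count pixels where any color channel rises above the threshold.
    if (data[i] > threshold || data[i + 1] > threshold || data[i + 2] > threshold) {
      bright++;
    }
  }
  // Treat the image as "black" if under 1% of pixels show any light.
  return bright / (data.length / 4) < 0.01;
}
```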

This project relies on inputs from various actors to work. All projects, when considered through a new materialist or object-oriented ontological lens, depend upon a range of normally unseen and unconsidered actors. This project tries to linger on, and to amplify in credible ways, the work of a nonhuman creative network and its actors.

The small programs I started with a year ago existed on a small and mappable network. Some of the paths laid out, like those of P5JS, were huge and complex, but the majority of the system lived in my laptop. The network of connections from any modern smart device (like my laptop) is global in scope and crosses political, scientific, economic, and other boundaries. They are Morton’s hyperobjects: not one static, graspable thing but a thing stretched across traditional ontological categories and boundaries once thought impossible to cross. I am going to set aside the investigation into the hyperobject nature of laptops generally (and my six-year-old work-issue laptop specifically) with the understanding that they are in a number of ways black boxes, but that they are the same black box for the vast majority of graphic designers in the world. With that set aside, the programs I was writing were primarily self-contained. I understood the inputs; I was responsible for them in various ways.

The sentences I had written (not at all for this purpose) added an element of structure to the project. Essays I had written for Natalia, in response to her prompts, many of which were taken from my copy of Writing for the Design Mind, provided human-made content for the bot. Each sentence might be the one used as the prompt to the text-to-image program; I wrote them, but I did not choose them. That they might be used this way was not at all my intention when writing them, but maybe that highlights, in some important way, how little an author’s intentions—my intentions—matter.
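The random choice itself is only a couple of P5JS calls. A minimal sketch, assuming the essays had been flattened into a one-sentence-per-line text file (the filename is hypothetical):

```javascript
// A minimal P5JS sketch of the random sentence selection. The
// filename is a hypothetical stand-in for the flattened essays.
let sentences;

function preload() {
  sentences = loadStrings('essays-for-natalia.txt');
}

function setup() {
  noCanvas();
  // random() handed an array returns one element of it.
  const prompt = random(sentences);
  console.log(prompt); // this sentence becomes the text-to-image prompt
}
```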

The other writing I did for this project was writing code, in JavaScript, using the P5JS library. What writing means in these two spaces feels, in practice, very different, but in the metaphorical space of thinking about writing, this opens up something interesting for me. In code, the words change things (they literally make things happen): a written value can change the color of a thing, its size, its shape, its location. Code performs as a performative utterance; the words as written make the thing real. Like the oft-used example of a performative speech act, the words “I now pronounce you man and wife” spoken during a marriage ceremony, written code performs the act it describes.
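A trivial P5JS illustration of what I mean: the written values below are not descriptions of a circle, they are the circle.

```javascript
// Code as performative utterance: these written values do not
// describe the circle, they enact it. Change the words and the
// thing itself changes.
function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(240);
  fill(200, 60, 90);     // this line makes the circle this color
  circle(200, 200, 150); // these numbers put it here, at this size
}
```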

These inputs I understand. The elements are known, and I am in control of them to a large extent. The higher-level, more abstract aspects of JavaScript are opaque, but they are learnable, discoverable.

This project differed from my smaller, localized programs. This network was tied into numerous large and unknowable black boxes. The text-to-image model attnGAN is massive and complex. I know some of how it works, and I can affect it, but my input is no longer the performative utterance of my simpler programs. The words I write are now much more like the writing in my essays. They suggest rather than declare; they are shared and interpreted by a second party. In this case, the interpretation and translation is closer to communicating with an alien.

The software also runs through a commercial provider. RunwayML is a company that provides simplified access to machine learning tools. While helpful, it is another layer of opacity, one more collection of layers added to the network. I am not sure how they create the clean UX of the interface, or what is lost or given up to achieve it. It is another reassuringly tidy black box.
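From memory, talking to a hosted model looked something like the sketch below. The route, port, and payload shape are assumptions reconstructed from that era of RunwayML, not documented API:

```javascript
// A sketch of the kind of request a RunwayML-hosted model accepted,
// heavily simplified. The URL and the "caption" input name are
// assumptions, not documented API.
async function queryModel(promptText) {
  const res = await fetch('http://localhost:8000/query', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ caption: promptText }),
  });
  const out = await res.json();
  // attnGAN's output came back as image data to be written into the
  // watched folder for the Puppeteer half of the pipeline to find.
  return out;
}
```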

The inputs for this project, some known and used as I might have intended, others used or repurposed in odd fashion, were all strange and sometimes chaotic-feeling. Strange things go in; stranger things come out. What the final output of this project was, and who it was for, is unclear. The bot itself, the GAN-produced images, the posts on Instagram...these are all in some ways the final output, depending on the perceived audience. The text, initially for Natalia and taken largely out of context, was the output intended for the GAN model, which prompted the new image ready for upload, triggering the publication of the image and text...each aspect of the network working relationally with the rest, until the audience (the Instagram scroller) is finally brought in at the end of the process. An audience seeing, interacting with, decoding, and reading the image is also performing a creative act, a part of the meaning-making. The images that the bot created were never intended to be edited or reviewed—although I did delete one post that had something that looked a lot like a swastika in it, and I also removed the all-black outputs created by technical errors with RunwayML (I didn’t see them until I refreshed Instagram).

The outputs of this project are closer to atonal jazz or automatic drawing. If you can see something of interest or hear something pleasing, that is just as much about your personal interpretation as it is about the output. The images generated from the attnGAN are never obvious representations of the content. They are disconnected abstractions of form and color. They are melting and horrible and abstract and beautiful. They are alien. They are maybe not really for us; it often feels like they aren’t. They are brave and expressive and have no fear (the antithesis of an Instagram influencer).

This project was a success, or very near to one, in my mind. The collaboration was rooted in a place of trust and care in a way that I had not been able to make happen, or participate in, in prior projects (save, perhaps, the first: my Camera Obscura project). Seeing the images that the model produced, strange as they are, was a clarifying moment for me. The near autonomy of the process, the clarity of the network—stretching out in every direction, but not with me at the center or top—showed me the kind of work that this collaborative trust and care could produce. This move away from control (even if only imagined control) was freeing and exciting. My imagination leapt.

This project ended one day while I was walking Dave Peacock through the details of the network. I made a change to the code, and it broke. It ended. It would have been very easy to fix the code and bring the network back online, making and posting strange, melty, dream-like images, but I again abandoned it; I withdrew my care. AttnGAN, the machine learning model doing the text-to-image work, was about three years old at this point (very old in the world of machine learning), and OpenAI had just announced the new models CLIP and DALL-E. The leaps in ability in these new models are almost impossibly dramatic. I decided to focus on them for my future projects, and I left @infinite.art.bot broken and finite.

The end.



︎︎︎︎︎︎︎︎︎
Next
Chapter 7 — CONTROL V1 a booklet about care.