I asked AI - specifically DALL-E-2 by OpenAI - to generate images depicting what it "thinks" the effects of climate change will be on the landscape, specifically the Indiana forests where I live and work. I spent hours honing my prompts, since subtle changes result in vastly different images, and I ran prompts repeatedly, since the model offers up different images each time a prompt is run. I curated the 18 images that most spoke to me.
The images generated are compelling for several reasons. The landscapes depicted are anything but inviting, with bare, perhaps burnt trees, red and orange skies, and tangles of invasive plant growth. The colors are mostly on the warm side - despite my having asked for only earth tones; perhaps it will be impossible to depict a warming world without bright reds and oranges, as if those will be the earth tones of the future. Patches of green can be read two ways - as tangles of invasives, or as little patches of hope that the forests are more resilient than we imagine.
I used the 18 images to create patterns for my stitched paper artwork. I printed recycled junk mail with a gel plate, creating papers to match the colors and textures in the AI images. These papers were hand cut and machine stitched to create 18 artworks, 8"x8" each. The artworks were then mounted on prepared birch board, and they are intended to be hung together.
I was surprised by the emotions with which I responded to the AI-generated images. They are, of course, shaped by the specific words I chose for my prompts, but DALL-E-2 returns images drawn from its training on billions of text-image pairs, so ultimately they come from some collective mind. How much do these images reflect how we humans collectively feel about what is to come?