Experiment 01
Made with Google, AttnGAN + DenseCAP

The initial stage of this project was created by plugging screenshots from different photo spheres on Google Earth into Google's reverse image search. Given a primary photograph (reproduced on the cover of this book), Google's algorithm generates large sets of images it deems "visually similar," using the picture's defining characteristics to infer what it contains.

Experiment 01 is divided into three sections. The photographs in the first section are results returned by Google's reverse image search. The second section was created with AttnGAN, a generative adversarial network that synthesizes images from text descriptions: each image from the first section is described in my own words, as an assumption of what I am seeing without context (much like how the algorithms in this experiment operate), and AttnGAN then produces an image from that description. The third section feeds the images created by AttnGAN into a captioning algorithm called DenseCAP, which labels a picture each time the model detects an object, placing each label at the detected object's coordinates.
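The three-stage process described above can be read as a pipeline. The sketch below is purely conceptual: every function is a hypothetical stub standing in for the real tools (Google reverse image search, AttnGAN, DenseCAP), and none of them calls an actual model or API.

```python
"""Conceptual sketch of the Experiment 01 pipeline.

All functions are hypothetical placeholders; the strings and dicts they
return only illustrate the shape of each stage's output.
"""

def reverse_image_search(screenshot: str) -> list[str]:
    # Stage 1 (placeholder): Google returns "visually similar" images.
    return [f"similar_to_{screenshot}_{i}.jpg" for i in range(3)]

def describe_without_context(image: str) -> str:
    # Stage 2a (placeholder): the artist's own-words description,
    # an assumption about what the image shows, made without context.
    return f"a plain-language guess at what {image} shows"

def attngan_generate(description: str) -> str:
    # Stage 2b (placeholder): AttnGAN synthesizes an image from text.
    return f"synthetic_image_for({description!r})"

def densecap_label(image: str) -> list[dict]:
    # Stage 3 (placeholder): DenseCAP emits a label for each detected
    # object, positioned at that object's coordinates (bounding box).
    return [{"label": "detected object", "box": (0, 0, 64, 64),
             "source": image}]

def experiment_01(cover_screenshot: str) -> list[list[dict]]:
    similar = reverse_image_search(cover_screenshot)       # section 1
    synthetic = [attngan_generate(describe_without_context(img))
                 for img in similar]                       # section 2
    return [densecap_label(img) for img in synthetic]      # section 3
```

The point of the sketch is the chaining itself: each stage consumes only the previous stage's output, so every assumption an algorithm (or the artist) makes is compounded by the next.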
© 2022 noah melrose