Why Would You Want To Picture It (2019)
Sound Installation and Sculpture. 4ch, 6:00min. Painted steel, 3m x 1m.
A sculpture of a neural network diagram, taken from a research paper and rendered ten feet long in painted steel, floats in the gallery. It oscillates ever so slightly, making it hard to discern whether it is spatial at all. The room is filled with a synthetic voice recounting the experience of being a vector inside a neural network: the composition of its landscapes, their rules and poetry.
The piece stages the disconnect between what can be observed, reasoned about, and communicated, and that which emerges within artificial neural networks invisibly and unimaginably: their “intelligence”. It asks to what extent the world can be modeled through mathematics and, conversely, whether we can understand these models. Can we understand what we cannot observe or imagine?
Writing, Why Would You Want To Picture It (2019)
The Chair Project (Four Classics) (2019)
Chrome Steel, Wood
The Chair Project is a series of four AI-designed, human-manufactured chairs. The piece reverses the roles of human and machine in the design process and industrial production.
With the human body omitted from the process, the technology dreams up entirely new forms of classic seating furniture: chairs both ordinary and alien, yet ultimately useless, an irony of AI solutionism.
Collaboration with Steffen Weiß
Full-scale models commissioned by MAK Vienna for Vienna Biennale 2019. Prototype design and manufacturing by Mikkel Mikkelsen.
Publication:
The Chair Project: A Case-Study for using Generative Machine Learning as Automatism. ML for Creativity and Design Workshop. NeurIPS 2018, Montréal.
The Chair Project (Four Classics) (2019)
The Chair Project (Four Classics). Photo: Aslan Kudrnofsky/MAK (2019)
The Chair Project (Four Classics) (2019)
The Chair Project (Four Classics) (2019)
The Chair Project (Four Classics) (2019)
The Chair Project (Four Classics) (2019)
Introspections (2019)
Sending a blank canvas to a machine learning model meant to process photos returns … a blank image. But repeat the action over and over and the model starts to introduce its own artifacts. It's subtle at first, but ultimately the model devours the image, creating an abstract visualization of its own inner state and architecture. An introspection made visible, the piece materializes a glimpse into the opaque, high-dimensional vector spaces in which AI makes its meaning.
Created during a residency at Runway AI.
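As a side note, the feedback loop behind these prints is simple enough to sketch in a few lines. In the sketch below, run_model is only a stand-in (a basic PIL filter) for the colorization, PhotoSketch and ESRGAN models actually used, so it illustrates the loop structure rather than the artifacts a real network would introduce.

    # Feedback loop: start from a blank canvas and feed the model's own
    # output back into it, over and over.
    from PIL import Image, ImageFilter, ImageOps

    def run_model(img: Image.Image) -> Image.Image:
        # Placeholder for an image-to-image model; swap in a real network here.
        return ImageOps.autocontrast(img.filter(ImageFilter.GaussianBlur(1)))

    canvas = Image.new("RGB", (512, 512), "white")  # the blank canvas
    for i in range(214):  # e.g. 214 iterations, as in the PhotoSketch print
        canvas = run_model(canvas)
        if (i + 1) % 50 == 0:
            canvas.save(f"introspection_{i + 1:03d}.png")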
Introspection (Automatic Colorization, 37 iterations). Giclée print, 15x15in. Edition of 10 (2019)
Introspection (PhotoSketch, 214 iterations). Giclée print, 15x15in. Edition of 10 (2019)
Introspection (ESRGAN, 58 iterations). Giclée print, 15x15in. Edition of 10 (2019)
Learning To See (2019)
Coloring Book of images used to train artificial intelligence. 16 pages, inkjet. Edition of 200.
The back cover reads: "Evolution over millennia attuned human vision to spot predators. This book helps children to see the world more like the predators of our time: autonomous systems using computer vision algorithms in hunting grounds from marketing to drone warfare. ‘Learning to See’ aims to give kids an evolutionary advantage and is fun along the way."
Buy Here
Learning To See (2019)
Learning To See (2019)
Learning To See (2019)
Learning To See (2019)
Learning To See (2019)
Humans of AI (2019)
Online exhibition comprising three pieces based on a reverse-engineering of the COCO computer vision dataset.
The data from which machine learning algorithms learn to make predictions is hardly ever shown, let alone credited. By doing both, Humans of AI exposes the myth of magically intelligent machines, applauding instead the photographers who made the technical achievement possible. In fact, when the actual training pictures are shown, credit is not only due but mandatory.
https://humans-of.ai
This work was supported by NYU ITP.
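As a technical aside, the attribution data sits in plain sight: the public COCO annotation files record a source URL and license for every training image. Below is a minimal sketch of reading them, assuming the standard instances_train2017.json file from cocodataset.org; the project's own reverse-engineering pipeline may differ.

    import json

    # Load the standard COCO annotation file (an assumption, see above).
    with open("instances_train2017.json") as f:
        coco = json.load(f)

    # Map license ids to their names (mostly Creative Commons variants).
    licenses = {lic["id"]: lic["name"] for lic in coco["licenses"]}

    # Every image entry carries the original Flickr URL and a license id,
    # which is what makes crediting the photographers possible at all.
    for img in coco["images"][:10]:
        print(img["file_name"], img["flickr_url"], licenses[img["license"]])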
Humans of AI: Declassifier. Custom Software (2019)
Humans of AI: Slideshow. Animation, 6 days 20 hours 17 min (2019)
Humans of AI: Certificates. 34,248 Certificates (2019)
Computed Curation (2017)
Computed Curation is a photobook created by a computer. Taking the human editor out of the loop, it uses machine learning and computer vision tools to curate a series of photos from an archive of pictures.
Considering both image content and composition — but through the sober eyes of neural networks, vectors and pixels — the algorithms uncover unexpected connections and interpretations that a human editor might have missed.
Machine-learning-based image recognition tools are adept at identifying the kinds of images they were trained on (an umbrella, a dog on a beach, a car), but quickly expose their flaws and biases when challenged with more complex input. In Computed Curation, these flaws surface in often bizarre and sometimes poetic captions, tags and connections. Moreover, by urging the viewer to constantly speculate on the logic behind its arrangement, the book teaches the viewer to see the world through the eyes of an algorithm.
Web Version. Buy through Bromide Books.
Publication: Computed Curation. In INTERSECTIONS: 16th Biennial Symposium on Arts & Technology, Ammerman Center for Arts & Technology (p. 96).
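Purely as an illustration of how such a curation could work (the book's actual toolchain is not reproduced here), one could embed every photo with an off-the-shelf vision network and then walk the archive greedily from nearest neighbor to nearest neighbor, so that each page follows the photo most similar to the last. The sketch assumes torchvision and a local folder archive/ of JPEGs, both placeholders.

    from pathlib import Path

    import torch
    from PIL import Image
    from torchvision import models, transforms

    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    # Pretrained ResNet-18 with the classifier head removed -> 512-d embeddings.
    backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    backbone.fc = torch.nn.Identity()
    backbone.eval()

    paths = sorted(Path("archive").glob("*.jpg"))
    with torch.no_grad():
        feats = torch.stack([
            backbone(preprocess(Image.open(p).convert("RGB")).unsqueeze(0)).squeeze(0)
            for p in paths
        ])

    # Greedy nearest-neighbor walk: each photo is followed by its closest
    # unused neighbor in embedding space, yielding a page sequence.
    order, remaining = [0], set(range(1, len(paths)))
    while remaining:
        last = feats[order[-1]]
        nxt = min(remaining, key=lambda i: torch.dist(last, feats[i]).item())
        order.append(nxt)
        remaining.remove(nxt)

    for i in order:
        print(paths[i].name)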
Computed Curation. 204p, offset, leporello (2017)
Computed Curation. 204p, offset, leporello (2017)
Computed Curation. 204p, offset, leporello (2017)
Genesis (2018)
The title, Genesis, refers to the process of manually creating datasets for generative machine learning (ML) projects. ML requires large amounts of data which, in many cases, is hard to come by. This is why we fall back on a few public repositories and, possibly, why research papers prove concepts by generating kittens or celebrity faces. We can also crowdsource datasets, but this comes with its own limitations and politics.
I sourced my own dataset at the Reanimation Library, a library of misfit books in Brooklyn. My beetles are based on "A Book of Beetles" (1965) by Josef R. Winkler (author) and Vladimir Bohac (illustrator). In a laborious process, I scanned the book, transformed the scans into a pix2pix training set, and used it to train the neural network.
The result, after many hours of work, is 25 bugs. My dataset is way too small for any serious application. Ultimately, the project is a reflection on where ML datasets come from and the politics inherent in their genesis.
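For illustration, below is a minimal sketch of turning scanned plates into paired training images in the side-by-side format the pix2pix reference implementation expects. The Canny edge extraction and the folder names are assumptions for the sketch, not a record of how the Genesis dataset was actually prepared.

    from pathlib import Path

    import cv2
    import numpy as np

    src_dir, out_dir = Path("scans"), Path("pix2pix_pairs")
    out_dir.mkdir(exist_ok=True)

    for path in sorted(src_dir.glob("*.png")):
        scan = cv2.resize(cv2.imread(str(path)), (256, 256))
        # An edge map stands in for the "input" domain; the scan is the "target".
        edges = cv2.Canny(cv2.cvtColor(scan, cv2.COLOR_BGR2GRAY), 100, 200)
        edges = cv2.cvtColor(edges, cv2.COLOR_GRAY2BGR)
        pair = np.concatenate([edges, scan], axis=1)  # A|B pair, 256 px tall, 512 px wide
        cv2.imwrite(str(out_dir / path.name), pair)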
Genesis. 25 AI-generated beetles. Digital-C print in wooden display box (2018)
In Search Of An Impossible Object (2018)
statement pending
A Computer Walks Into A Gallery …
Installation, painted wood, 13ft x 8ft, in front of: Michael Morgner, Man Pacing, Mixed media on paper laid on canvas
Michael Morgner's “Schreitender” (English: “Man Pacing”) was shown to a computer vision algorithm trying to “see” features in the image. The installation is a physical manifestation of the computer's limited understanding of the painting and of the forms of representation of automated vision. It invites visitors to rediscover the work computationally while getting in the way of the actual experience.
Commissioned by DAAD for the Consulate General of Germany in New York, USA.
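As an aside, the kind of machine seeing the installation gives physical form to can be approximated in a few lines. The sketch below runs OpenCV's ORB keypoint detector over a reproduction of the painting; both the file name and the choice of detector are assumptions, not the specific algorithm behind the installation.

    import cv2

    # Load a reproduction of the painting (placeholder file name).
    img = cv2.imread("schreitender.jpg", cv2.IMREAD_GRAYSCALE)

    # Detect up to 200 ORB keypoints -- the "features" a machine sees.
    orb = cv2.ORB_create(nfeatures=200)
    keypoints = orb.detect(img, None)

    # Draw the detected keypoints over the image and save the result.
    vis = cv2.drawKeypoints(cv2.cvtColor(img, cv2.COLOR_GRAY2BGR), keypoints,
                            None, color=(0, 255, 0))
    cv2.imwrite("schreitender_features.png", vis)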
A Computer Walks Into A Gallery... Intervention. Painted wood, 13ft x 8ft (2018)
A Computer Walks Into A Gallery... Intervention. Painted wood, 13ft x 8ft (2018)
A Computer Walks Into A Gallery... Intervention. Painted wood, 13ft x 8ft (2018)
Human Element Inc. (2016)
pending
Raising Robotic Natives (2016)
“We live in times of transition, and as we advance toward the upcoming robotic quickening, much has been said about the ‘uncanny valley’ — that feeling of unease that humanoid objects elicit in some people. Are these fears a social construction or instinctive? Tackling these issues, Raising Robotic Natives presents a series of artifacts for generations who, much like digital natives, are being born into a yet-to-be society with high robot density. The result is a lucid speculation on how the normalcy of robots will change vernacular spaces and domestic infrastructure. Crucially, the reality it portrays avoids robo-apocalyptic scenarios: kids can accustom themselves to reading with the help of a child-friendly illustrated edition of Isaac Asimov's Three Laws of Robotics. Also, in case of need, a "living room kill switch" triggers a short circuit that shuts down all electricity—robots included.”
[src: Broken Nature]
More Information.
Location-Based Light Painting (2014)
caption pending