
Examining the prevalence and limits of computer vision

Derivations

[Exhibition photo: gray_area_ai_exhibit-054.jpg]

The Derivations Project creates a visual dialogue between artists, machines, and viewers.  Artists in the Gray Area incubator program were interviewed about their art practice, the role of the artist, and Stable Diffusion.  I then built machine learning pipelines that generate visual outputs from their words.  These outputs are transposed onto the original interview footage, and the result is projected on top of the lens and a cubic grid system.
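
For those curious about the mechanics, below is a minimal sketch of the text-to-image step, assuming the open-source Hugging Face diffusers library and a public Stable Diffusion checkpoint. The actual Derivations pipelines are more involved, and the prompt here is a hypothetical stand-in for a transcribed interview excerpt.

    # Minimal text-to-image sketch, not the project's actual pipeline.
    # Assumes the Hugging Face diffusers library and a CUDA-capable GPU.
    import torch
    from diffusers import StableDiffusionPipeline

    # Load a public Stable Diffusion checkpoint (assumption: any checkpoint
    # compatible with this pipeline class would work here).
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",
        torch_dtype=torch.float16,
    ).to("cuda")

    # Hypothetical excerpt standing in for an artist's interview answer.
    prompt = "the role of the artist is to ask questions the machine cannot"

    # Generate one image from the words alone.
    image = pipe(prompt, num_inference_steps=50, guidance_scale=7.5).images[0]
    image.save("derivation_frame.png")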

 

The incongruences between a person’s words and the algorithmic interpretation celebrate the humanity in all of us that can never be boiled down to data points.  Artificial intelligence algorithms are often seen as fair and objective, but in reality they encode and perpetuate bias in an unjust world.  Google Photos’ image-labeling algorithm tagged Black people as gorillas, Amazon’s résumé-screening algorithm taught itself that male candidates were preferable, and the COMPAS recidivism algorithm predicted higher recidivism rates for Black defendants despite claiming to offer an unbiased score for sentencing and not taking race as an input.  These models, and countless others, have caused immeasurable harm to millions of people.  This project hopes to bring that harm to light and show why it is more important than ever to question the algorithms that construct the reality we live in.

[Exhibition photos: gray_area_ai_exhibit-046.jpg, gray_area_ai_exhibit-048.jpg, gray_area_ai_exhibit-050.jpg, gray_area_ai_exhibit-052.jpg]

Art sales and collaboration inquiries welcome.

contact keck.quinn<at>gmail.com

  • Instagram - algorithmic.misperceptions
  • GitHub
  • LinkedIn

© 2025 by Quinn Keck
