ML Art Installation – "From Machine for Machines"
Autonomous art piece using machine learning to interact with mobile devices. Explores the complex relationship between artificial intelligence and human connectivity through responsive digital sculptures that evolve based on audience interaction patterns.
Autonomous Art
Self-evolving installation that learns and adapts without human intervention
Mobile Interaction
Direct communication between AI system and visitors' personal devices
AI Ethics Exploration
Critical examination of machine autonomy and human-AI relationships
Technology Used
DCGAN for image generation, trained on GPU-enabled Google Cloud; WikiArt imagery as the training corpus; openFrameworks for real-time interactive rendering.
Goals
In this project, we explore how an interactive art installation built on machine learning can engage ideas around the Anthropocene and encourage reflection on humankind's relationship with machines. We created a physical installation in which mobile phones interacted with ML-generated art pieces: each phone provided input to the interaction and drove the morphing of the visuals. In essence, the mobile phones were the audience of this installation.
Solution
The machine-learning model was a DCGAN trained on GPU-enabled Google Cloud instances. The training input was imagery from WikiArt spanning abstract art, portraits, and Impressionism. Training produced more than 1,000 generations and roughly 10,000 images.
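The source does not include the training code, so as a minimal sketch, here is what the DCGAN generator could look like in PyTorch. The 100-dimensional latent vector, the 64×64 RGB output size, and all layer widths are assumptions, not details from the project:

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Minimal DCGAN-style generator: latent vector -> 64x64 RGB image."""
    def __init__(self, z_dim=100, ngf=64):
        super().__init__()
        self.net = nn.Sequential(
            # 1x1 -> 4x4
            nn.ConvTranspose2d(z_dim, ngf * 8, 4, 1, 0, bias=False),
            nn.BatchNorm2d(ngf * 8), nn.ReLU(True),
            # 4x4 -> 8x8
            nn.ConvTranspose2d(ngf * 8, ngf * 4, 4, 2, 1, bias=False),
            nn.BatchNorm2d(ngf * 4), nn.ReLU(True),
            # 8x8 -> 16x16
            nn.ConvTranspose2d(ngf * 4, ngf * 2, 4, 2, 1, bias=False),
            nn.BatchNorm2d(ngf * 2), nn.ReLU(True),
            # 16x16 -> 32x32
            nn.ConvTranspose2d(ngf * 2, ngf, 4, 2, 1, bias=False),
            nn.BatchNorm2d(ngf), nn.ReLU(True),
            # 32x32 -> 64x64, tanh maps pixels to [-1, 1]
            nn.ConvTranspose2d(ngf, 3, 4, 2, 1, bias=False),
            nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)

g = Generator()
z = torch.randn(2, 100, 1, 1)   # a batch of two latent vectors
img = g(z)                      # shape: (2, 3, 64, 64)
```

Sampling many such latent vectors over many training checkpoints is one way a run can accumulate thousands of distinct output images, as described above.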
These outputs were then combined with volume metamorphosis and made interactive in openFrameworks, enabling real‑time morphing driven by each visitor’s device.
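The piece itself uses volume metamorphosis inside openFrameworks; as a simpler illustration of smooth morphing between GAN outputs, here is spherical latent interpolation (slerp), a common GAN technique but not necessarily the method used in this installation. The idea is that a visitor's device input sets the interpolation parameter `t`, and the blended latent vector is fed to the generator each frame (the mapping from device to `t` is an assumption for illustration):

```python
import numpy as np

def slerp(t, z0, z1):
    """Spherical interpolation between two latent vectors, t in [0, 1]."""
    u0 = z0 / np.linalg.norm(z0)
    u1 = z1 / np.linalg.norm(z1)
    omega = np.arccos(np.clip(np.dot(u0, u1), -1.0, 1.0))
    if np.isclose(omega, 0.0):
        # Vectors are nearly parallel; fall back to linear interpolation.
        return (1.0 - t) * z0 + t * z1
    return (np.sin((1.0 - t) * omega) * z0 + np.sin(t * omega) * z1) / np.sin(omega)

rng = np.random.default_rng(0)
z_a = rng.standard_normal(100)   # latent for artwork A
z_b = rng.standard_normal(100)   # latent for artwork B

# t would come from a phone's input (e.g. touch position or sensor value).
t = 0.5
frame_latent = slerp(t, z_a, z_b)  # blended latent, ready for the generator
```

Sweeping `t` from 0 to 1 frame by frame yields a continuous morph from one generated image to another, which is the real-time behavior each visitor's device drives.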