Leon-Etienne Kühr
Leon-Etienne Kühr is a computer scientist and media artist working with methods of information visualization, data science, and artificial intelligence. In his practice, he investigates the mechanisms of AI-driven automation and the increasingly intertwined relationship between the world and its digital representation in generative AI. He is a research assistant and co-director of the AI Lab at the Offenbach University of Art and Design, and previously studied media arts at the KHM Cologne and media informatics at the Bauhaus University Weimar.
Contribution
Generative AI models don't operate on human languages – they speak in tokens. Tokens are computational fragments that deconstruct language into subword units, stored in large dictionaries. These tokens encode not only language but also political ideologies, corporate interests, and cultural biases even before model training begins. Social media handles like "realdonaldtrump", brand names like "louisvuitton", or even "!!!!!!!!!!!!!!!!" exist as single tokens, while other words remain fragmented. Through various artistic and adversarial experiments, we demonstrate that tokenization is a political act that determines what can be represented and how images become computable through language.
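The segmentation described above can be sketched with a minimal, self-contained example. The toy vocabulary and the greedy longest-match strategy here are illustrative assumptions, not the vocabulary or algorithm of any specific model; production tokenizers use vocabularies of tens of thousands of entries learned with methods such as byte-pair encoding. The sketch shows the core political point: whatever the dictionary happens to contain survives as a single unit, while everything else is fragmented.

```python
# Toy vocabulary: a hypothetical stand-in for a learned subword dictionary.
TOY_VOCAB = {"realdonaldtrump", "louis", "vuitton", "token", "iz", "ation"}

def tokenize(text: str, vocab: set) -> list:
    """Greedy longest-match segmentation against a fixed vocabulary.

    Strings present in the vocabulary become single tokens; everything
    else falls apart into smaller pieces, down to single characters.
    """
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest possible match starting at position i first.
        for j in range(len(text), i, -1):
            if text[i:j] in vocab:
                tokens.append(text[i:j])
                i = j
                break
        else:
            # No vocabulary entry matches: the character stands alone.
            tokens.append(text[i])
            i += 1
    return tokens

print(tokenize("realdonaldtrump", TOY_VOCAB))  # → ['realdonaldtrump']
print(tokenize("tokenization", TOY_VOCAB))     # → ['token', 'iz', 'ation']
```

The handle passes through as one indivisible token because it was granted a dictionary entry; "tokenization" itself is shattered into fragments because it was not.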