Interface After AI: Rethinking AI Interfaces

Roberto Alonso Trillo (Editor), Marek Poliks (Editor)

Research output: Other contribution (peer-reviewed)


This special issue addresses the limitations that existing approaches to interface design impose on the current epoch of deep learning technology, and explores how insights from musical and artistic instrumentality may help unblock further development.

When surveying the highest-adoption commercial and creative deep learning tools in 2024 (e.g. ChatGPT, MidJourney, Gemini, Runway), it is possible to identify a strong sense of convergence or sameness: a user ideates, expresses their idea in linguistic or other symbolic form, and a program leverages a small number of possible toolchains (PyTorch/TensorFlow, HuggingFace, CUDA, etc.) to iterate upon and execute a response. One might call this workflow a transcriptive interface: a downstream motion from user (idea) to computer (executor) through various stages of symbolic translation. This type of interface tends to yield a specific user experience: a chatbot or chat platform, a language prompt, a visual display with a blinking cursor.

Rethinking the transcriptive approach to interface design is a crucial step toward accelerating the diversification of AI toolchains, processes, and relationships. One can look to the arts for many alternative interface design paradigms. For example, the field of musical instrumentality and instrumental performance has enumerated a massive typology of nonlinear, cocreative relationships between humans, materials, and sounds that stand in stark contrast to a "blinking cursor" in a chat window. We argue that, by applying insights from these alternative interfacial regimes, we can unlock new possibilities for AI to become a creative partner, rather than merely a tool for human ideators.

Destabilizing the "blinking cursor" requires deeper work than translating its unilateral input-output flow into other forms and formats. To paraphrase Caroline Busta and Lil Internet: in an era in which content production is essentially limitless and practically free, the scope of art-making and artistic thinking must move to the level of the protocol, the systems design, the "urban planning" of the plumbing networks of generative mechanics. Our objective is to open the floodgates to speculative redesigns, theoretical design paradigms, and infrastructural modes that upend the input-output relationship (what Luciana Parisi might call the servo-mechanical relationship) altogether. While practical implementations will be considered, we will prioritize big, projective, even outlandish conceptual work within the broader field of HAIID (Human-AI Interaction Design).

We invite researchers, musicologists, artists, and technologists to submit their work on reimagining interfaces that foster expanded interactions between humans and AI.

Themes and Topics
Contributions to this special issue may include, but are not limited to, the following areas of inquiry:

Nonlinear Deep Learning Interfaces: Proposals for interfaces that depart from traditional visual, semantic, and linear paradigms, enabling new forms of creative expression, exploration, and collaboration with AI.

Interfacial Regimes in Music and the Arts: Examination of interfacial frameworks within music and artistic practices that provide alternative models for engaging with deep learning technologies.

Post-Transcriptive AI: Exploration of new models, toolchains, and possibilities for deep learning outside of strictly transcriptive functions and relationships.

New Musical Paradigms: Visions of music after the development of deep learning tools unbound from transcriptive modalities.

Original language: English
Type: Journal issue
Publisher: Orpheus Institute
Publication status: Accepted/In press - 14 May 2024

Publication series

Name: ECHO Journal
Publisher: Orpheus Institute
ISSN (Electronic): 2736-5824

Scopus Subject Areas

  • Music
  • Arts and Humanities (miscellaneous)
  • Artificial Intelligence
