AI, neural networks and style appropriation
Borrowing, appropriation and intertextuality are particularly interesting concepts to explore given the recent acceleration in art and music produced using Machine Learning. Every network used to create art or music functions only because it is trained on a dataset of pre-existing work. Some of these datasets are relatively modest – Dadabots’ Relentless Doppelgänger was trained on the music of the technical death metal band Archspire. Others are huge – Stability AI’s Stable Diffusion was trained on the LAION-2B-en dataset, which contains over two billion text–image pairs. What does ‘style’ mean in this context? How is it expressed? And who owns it?
Jennifer Walshe is Professor of Composition at Oxford and Fellow of Worcester College.
CJ Carr is Head of Audio Research at Stability AI. He is also one of the Dadabots.
Christine McLeavey is a Member of the Technical Staff at OpenAI, where she recently released Jukebox and MuseNet. She studied physics at Princeton University, and neuroscience and medicine at Stanford University. She is also a trained classical pianist.
Free to attend, no registration required. Please click this link to join the Zoom meeting.