On AI-created works
[Originally posted in a different space, March 2023; lightly edited, link added.]
A core issue is not necessarily about a machine generating output per se. It's at least partly about artists' work being used as inputs.
A human artist can certainly use the art they've seen to develop a style that imitates, or is at least ‘influenced by’, other artists' work - artists have always done that. But doing so requires a non-trivial amount of labour on the artist's part: both in learning the style, and in actually creating new works as a result.
But there's no direct human labour involved when an AI learns and reproduces an artist's style. It's a sudden, massive lowering of the barrier to imitating an artist who might have spent many, many hours over the years developing their individual style.
So the question is, to what extent is this ethical? And to what extent does this change according to context?
i don't have the answers. As a coder, i'm certainly uneasy that my code on GitHub is being used as an input to Copilot regardless of my wishes, and regardless of the licenses i release my code under. And i think the latter is a big part of my uneasiness: it's not that my work is being used for this sort of purpose per se, it's that corporations - highly profitable corporations at that - are using my work to improve their bottom line, without me having first given permission for my work to be used as input to a Large Language Model (LLM).
It's not that i'm inherently opposed to others profiting from my work; some of my code i've released under the ISC license[a], which allows anyone to build a product on it and make a profit without needing my permission, and without having to compensate me financially. But i put that code under the ISC license with informed consent: i was aware of what that might entail, and i consented to it. It's not clear to me that any code i've released, under any license, is ‘obviously’ legally or morally available for use as an input to an LLM.☙