You can’t really have it both ways.
Is the thing just a machine that’s following instructions and synthesizing its training data into different outputs? Then it’s a tool.
Is the thing making choices and interpreting your inputs to produce a result? Then it’s an artist.
The painter I buy a commission from is an artist. The AI I use to generate a scene is a tool.
Was the “training data” produced by artists or tools?
I mean, yes?
That’s very pithy, but the material used as training data was probably produced by artists attempting to create art with tools (AI and otherwise), along with more mundane data designed and produced by humans with no AI tools at all, and some produced by humans using almost exclusively AI tools.
You probably live in a different world than I do.
Don’t chicken/egg this. All of the training data was man-made, at least until the first LLMs started producing output based on it.
Secondly, the amounts of human-produced and LLM-produced content in the training data aren’t even comparable, and that will have to continue to be the case. Otherwise the models break.