Artificial intelligence firm Anthropic hits out at copyright lawsuit filed by music publishing corporations, claiming the content ingested into its models falls under ‘fair use’ and that any licensing regime created to manage its use of copyrighted material in training data would be too complex and costly to work in practice
GenAI tools ‘could not exist’ if firms are made to pay copyright
The fact that the “AI” can spit out whole passages verbatim when given the right prompts suggests that there is a big problem here, and they haven’t a clue how to fix it.
It’s not “learning” anything other than the probable order of words.
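For reference, “the probable order of words” taken literally describes something like a bigram model: count which word tends to follow which and turn the counts into probabilities. A minimal sketch in Python, using a made-up toy corpus purely for illustration:

# Toy illustration of a literal "probable order of words" model: a bigram
# counter. The corpus below is invented for the example.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count, for every word, how often each word follows it.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

# Turn counts into conditional probabilities P(next word | previous word).
def next_word_probs(prev):
    counts = follows[prev]
    total = sum(counts.values())
    return {word: count / total for word, count in counts.items()}

print(next_word_probs("the"))  # {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}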
I really hate this reduction of GPT models. Is the model probabilistic? Absolutely. But it isn’t simply learning a comprehensible probability of words; it is generating a massively complex sequence of conditional probability distributions over words. Humans might be said to do largely the same thing: we make a best guess at the sequence of words we use based on conditional probabilities across myriad conditions (including the semantics of what we want to say).
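To make the distinction concrete, here is a minimal sketch of what that conditional distribution looks like in an autoregressive language model. It assumes the Hugging Face transformers library and uses GPT-2 only as a small, convenient stand-in; the prompt and the choice of model are illustrative, not anything specific to the systems in the lawsuit:

# Inspect the model's distribution over the next token, conditioned on the
# entire prompt so far (not just the previous word).
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "The quick brown fox"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch=1, seq_len, vocab_size)

# Conditional distribution P(next token | whole prompt).
next_token_probs = torch.softmax(logits[0, -1], dim=-1)

# Show the five most likely continuations and their probabilities.
top_probs, top_ids = next_token_probs.topk(5)
for p, tok_id in zip(top_probs, top_ids):
    print(f"{tokenizer.decode(tok_id.item())!r}  p={p.item():.3f}")

Generating text is just repeatedly sampling from this distribution and appending the chosen token to the context; the argument in this thread is about whether that conditioning captures anything deeper than surface word order.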
What about these:
https://arxiv.org/abs/2310.02207
https://notes.aimodels.fyi/researchers-discover-emergent-linear-strucutres-llm-truth/
https://notes.aimodels.fyi/self-rag-improving-the-factual-accuracy-of-large-language-models-through-self-reflection/