That’s fair; the person I first responded to seemed to be discussing the “fine art” part.
How worried are you personally about these more advanced machine learning tools? Just a month or so ago I was playing a Pathfinder (RPG) video game and didn’t care for any of the built-in avatar images, so I hopped onto one of the websites that generate an image from a text prompt to make one that matched my character. It took a few tries to get the right wording, but in the end I got a pretty good image out of it. I vaguely know how it works (“vaguely” is doing a lot of heavy lifting in that sentence), and it still seemed kind of like magic.
the clients (no offense) tend to be less professional
I don’t know what you mean by this precisely, but the “pretty good” end result I mentioned had a hand that melted into the sword-- so if you meant “low standard” then yeah, guilty as charged, haha. However, more interesting to me is that I would never in 1000 years have paid someone to do that for me-- I just would have been low-level annoyed that my character and the avatar looked different the entire game.
I find the “they didn’t have permission to train from” argument is complete bunk. That’s not a right granted by intellectual property laws; there is no “right to control who learns from a work”.
What needs to happen is society (especially US society) needs to stop linking “working” and “enjoying a comfortable life”. Technology is coming for all our jobs, and the sooner we accept that and prepare for it, the better we’ll be when it happens.
I find the “they didn’t have permission to train from” argument is complete bunk. That’s not a right granted by intellectual property laws; there is no “right to control who learns from a work”.
Yeah, but is an AI LEGALLY learning? Or is it just a machine that spits out output based on its inputs? In that case, use of the work as input isn’t allowed under the copyright, which only covers the work being used by reading it.
All these comparisons between what an AI is doing and what a human does when reading/learning/etc are not a given in a court of law. We don’t have any rulings yet that an AI is actually “learning” like a human when it is “trained.”
“Training” an AI is building a tool. A tool that can be used to profit. Can artistic works be used to build a for-profit tool without permission?
This is something that needs to be decided, and whatever rules are decided for AI won’t be applied to a human. Meaning that if permission is required for use in machine learning, that won’t change the fact that a human can still learn from the work. So the comparison is pointless, because there is no way the courts are going to rule that these things are legally indistinguishable from people.
In the meantime, back to the original point, there ARE precedents for use of performances because of recordings. That’s why the studios wanted that in the contract: they KNOW they cannot manipulate a person’s performance through AI without the performer’s express written permission. Is it REALLY so hard to believe this can be applied to writing or art? That they can’t use writing or art without the artist’s express permission?
We may see a new kind of copyright soon that specifically disallows use for AI, and another that is open for use with AI. Something to replace Creative Commons on the internet.
There simply isn’t a right to control even training. That’s just not a thing. It would need a change to the law.

All right, I am not a lawyer, but I’ve been around the internet long enough to know there is arguably a right to control learning and training, because Fair Use copyright law SPECIFICALLY allows for educational use. That means the default is that, otherwise, it would not be allowed.
A judge could easily rule that AI training is not covered under Fair Use, as it is being used to create a profitable tool.
That’s a right to make copies and distribute them for educational purposes. What we’re discussing here doesn’t involve distribution of any kind. Arguably copyright law doesn’t even apply, but even under the broader term of “intellectual property”, it doesn’t hold up, even without trying to make a comparison between humans learning and AI training (which is more of an analogy).
Edit: and to be fair, I’m not a lawyer either, but IP law (especially regarding how terrible it has become) is kind of a hobby of mine. But I can’t claim to be any type of authority on it.
Okay, well my hobby is ethics.

And the thing is, if they are using works written by others to build an AI for profit without permission, that’s exploitation. Copyright law is horrible and exploited by corporations constantly. That doesn’t mean we shouldn’t cheer on the little guy when they try to use it to defend against exploitation by corporations. Because the big tech companies are exploiting creatives in their drive to build and sell this tool. They are exploiting creatives to make their replacements. So I’m going to go off on anyone making that comparison analogy.
Whatever the actual basis of the lawsuits against the AI companies, actual lawyers do think there’s a basis in IP law to sue, given that a few high profile lawsuits have been filed. And clearly there is some legal basis to sue over the use of performances in AI, or this contract would not have been proposed.
By that I mean that some dude looking for a game avatar or whatever isn’t as likely to be someone used to contracting people to do professional creative work for them. Professional clients who are accustomed to hiring creatives to do work for them are more likely to:
be quick to provide feedback and respond to emails
have feedback that is clear and actionable
communicate professionally
pay on time
be willing to pay a down payment and sign a contract
comprehend the hard work that goes into this stuff and value my time accordingly
not try to push a project (either intentionally or otherwise) into what is known as “scope creep”, wherein they jockey for additional work outside of the initially agreed-upon scope of work
And lots of other little things like that.
Am I saying that you are like that? No. But having done creative work for north of 15 years, this is my informed opinion based on a lot of experience in this field.
I find the “they didn’t have permission to train from” argument is complete bunk. That’s not a right granted by intellectual property laws; there is no “right to control who learns from a work”.
That’s your opinion and you might feel differently if you had spent years working hard to achieve something in this specific field.
What needs to happen is society (especially US society) needs to stop linking “working” and “enjoying a comfortable life”. Technology is coming for all our jobs, and the sooner we accept that and prepare for it, the better we’ll be when it happens.
This I fully agree with. And I wouldn’t even necessarily have a problem with AI destroying creative jobs if it meant I was now more free to pursue a life of spending time doing things that I was passionate about because some kind of UBI or whatever was making that possible for me and others.
Like I kinda mentioned earlier, I don’t think society is in a good place to fight for this, at least in the short term. Basically not until things get really bad. What I expect will happen for now is that almost all of the windfalls from automation will be siphoned up to the upper class and corporations, wages will continue to stagnate for the working class, and income inequality will continue to skyrocket.
That’s your opinion and you might feel differently if you had spent years working hard to achieve something in this specific field.
It’s not really an opinion; it’s just not a right granted by IP laws. I know that people that are financially dependent on this type of work really wish they had this right-- and I fully accept that if I were in the same boat, I would probably also wish I had this right, but that doesn’t magically add it to the law.
All the lawsuits you see popping up are Hail Marys; they’ll very likely all lose.
some kind of UBI or whatever was making that possible for me and others.
Something like this, set at a level that allowed a comfortable life (versus an austere one) would totally flip the whole employment dynamic. The pay for the worst jobs would skyrocket, because no one wants to do those jobs-- they only do them now to stave off starvation and homelessness.
siphoned up to the upper class and corporations, wages will continue to stagnate for the working class, and income inequality will continue to skyrocket
I can’t help but agree, with sorrow. I imagine it won’t get better (in the US, at least) until it impacts the wealthy-- as in, there aren’t enough people getting paid to buy the stuff that is getting created by automation. Capitalism needs money flowing to the bottom (traditionally, a wage) to sustain itself. If that flow of money dries up, the whole system collapses. We can either fix it by abandoning capitalism, or by patching capitalism by finding a way for money to flow down other than by wages. (A UBI, for example)
In my earlier comment in this thread I said my dilemma with using AI for my work was an ethical one, not a legal one. Ethics/morals inform laws for sure, but I think you’d agree that not everything that’s technically legal is also ethical. Especially so in a country like the US.
I think a lot of people would also agree that ethics are to some extent individual, meaning that what I find ethical or not is going to differ from what others find ethical. So whether or not this is all legal doesn’t mean that it’s going to jibe with my personal view of what is ethical.
That dilemma is my own. Whether or not congresspeople who have a weak grasp of both technology and the arts think one way or another on the matter is a poor yardstick for one’s own moral code of conduct, in my book.
In any case, good chat. I appreciate that while we don’t agree on everything we kept it civil. Now back to work for me (before it gets taken by a robot).
I understand that you may not reply because you feel the discussion has run its course, but I wanted to clarify that I genuinely hadn’t followed that you were speaking from a personal morality standpoint. Sorry about that.
No worries