Greg Rutkowski Was Removed From Stable Diffusion, But AI Artists Brought Him Back - Decrypt

Greg Rutkowski, a digital artist known for his surreal style, opposes AI art, but his name and style have been frequently used by AI art generators without his consent. In response, Stability AI removed his work from the training dataset for Stable Diffusion 2.0. However, the community has now created a tool that emulates Rutkowski’s style against his wishes using a LoRA model. While some argue this is unethical, others justify it on the grounds that Rutkowski’s art was already widely used in Stable Diffusion 1.5. The debate highlights the blurry line between innovation and infringement in the emerging field of AI art.

doeknius_gloek,

While some argue this is unethical, others justify it since Rutkowski’s art has already been widely used in Stable Diffusion 1.5.

What kind of argument is that supposed to be? We’ve stolen his art before so it’s fine? Dickheads. This whole AI thing is already sketchy enough, at least respect the artists that explicitly want their art to be excluded.

Otome-chan,

no one's art is being "stolen". you're mistaken.

Crankpork,

Aside from all the artists whose work was fed into the AI learning models without their permission. That art has been stolen, and is still being stolen. In this case very explicitly, because they outright removed his work, and then put it back when nobody was looking.

I_Has_A_Hat,

Let me give you a hypothetical that’s close to reality. Say an artist gets very popular, but doesn’t want their art used to teach AI. Let’s even say there’s legislation that prevents all this artist’s work from being used in AI.

Now what if someone else hires a bunch of cheap human artists to produce works in a style similar to the original artist, and then uses those works to feed the AI model? Would that still be stolen art? And if so, why? And if not, what is this extra degree of separation changing? The original artist is still not getting paid and the AI is still producing works based on their style.

wizardbeard,

Fine, you win the semantic argument about the use of the term “stealing”. Despite arguments about word choice, this is still a massively disrespectful and malicious action against the artist.

Crankpork,

So you hire people to trace the original art? That’s still copying it, and nobody is learning anything. It’s copying.

Harrison,

They didn’t say trace. A good artist can use the style of another artist when creating a new work.

Crankpork,

Yeah but a computer can’t, no matter how much people want to believe it can. Not with current tech.

Crankpork,

Comic book artists get in shit for tracing other people’s work all the time. Look up Greg Land. It’s shitty regardless of whether it’s a person doing it directly or someone has built software to do it for them.

CallumWells,

Strictly speaking it wouldn’t exactly be stealing, but I would still consider it about equal to it, especially with regard to economic benefits. It may not be producing exact copies (which strictly speaking isn’t stealing, but is violating copyright) or actually stealing, but it’s exploiting a style that most people would assume means that specific artist made it, and thus depriving that artist of the benefit of people wanting art from that artist/in that style.

Now, I’m not conflicted about people who have made millions off their art having people make imitations or copies, those people live more than comfortably enough. But in your example there are still other human artists benefiting, which is not the case for computationally generated works. It’s great for me to be able to have computers create art for a DnD campaign or something, but I still recognize that it’s making it harder for artists to earn a living from their skills. And to a certain degree it makes it so people who never would have had any such art now can. It’s in many ways like piracy with the same ethical framing. And as with piracy it may be that people that use AI to make them art become greater “consumers” of art made by humans as well, paying it forward. But it may also not work exactly that way.

Otome-chan,

People aren't allowed to produce similar styles to other humans? So do you support disney preventing anyone from making cartoons?

CallumWells,

Now you’re making a strawman. Other humans that are actually making art generally don’t fully copy a specific style, they draw inspiration from different sources and that amalgamation is their style.

Your comment reads as bad-faith to me. If it wasn’t meant as such you’re free to explain your stance properly instead of making strawman arguments.

grue,

That’s true, but only in the sense that theft and copyright infringement are fundamentally different things.

Generating stuff from ML training datasets that included works without permissive licenses is copyright infringement though, just as much as simply copying and pasting parts of those works in would be. The legal definition of a derivative work doesn’t care about the technological details.

(For me, the most important consequence of this sort of argument is that everything produced by Github Copilot must be GPL.)

rikudou,

That’s incorrect in my opinion. AI learns patterns from its training data. So do humans, by the way. It’s not copy-pasting parts of images or code.

grue,

By the same token, a human can easily be deemed to have infringed copyright even without cutting and pasting, if the result is excessively inspired by some other existing work.

Crankpork,

AI doesn’t “learn” anything, it’s not even intelligent. If you show a human artwork of a person they’ll be able to recognize that they’re looking at a human, how their limbs and expressions work, what they’re wearing, the materials, how gravity should affect it all, etc. AI doesn’t and can’t know any of that, it just predicts how things should look based on images that have been put in its database. It’s a fancy Xerox.

rikudou,

Why do people who have no idea how something works feel the urge to comment on how it works? It’s not just AI, it’s pretty much everything.

AI does learn, that’s the whole shtick and that’s why it’s so good at stuff computers used to suck at. AI is pretty much just a buzzword, the correct abbreviation is ML which stands for Machine Learning - it’s even in the name.

AI also recognizes when it’s looking at a human! It can also recognize what they’re wearing, the material. AI is also better at many, many things than humans are. It also sucks compared to humans at many other things.

No images are in its database, you fancy Xerox.

Crankpork,

And I wish that people who didn’t understand the need for the human element in creative endeavours would focus their energy on automating things that should be automated, like busywork, and dangerous jobs.

If the prediction model actually “learned” anything, they wouldn’t have needed to add the artist’s work back after removing it. They had to, because it doesn’t learn anything, it copies the data it’s been fed.

rikudou,

Just because you repeat the same thing over and over it doesn’t become truth. You should be the one to learn, before you talk. This conversation is over for me, I’m not paid to convince people who behave like children of how things they’re scared of work.

MJBrune,

At the heart of copyright law is intent. If an artist makes something, someone can’t just come along, copy it, and resell it. The intent is so that artists can make a living from their innovation.

AI training on copyrighted images and then reproducing works derived from those images, in order to compete with those images in the same style, breaks the intent of copyright law. Equally, it does not matter whether the new picture is original. If you take an artist’s picture and recreate it with pixel art, there have already been cases where copyright infringement settlements were made in favor of the original artist, despite the original picture not being used at all, just studied. See the cover art of Kind of Bloop, a pixel-art recreation of the Miles Davis Kind of Blue album cover.

grue,

You’re correct in your description of what a derivative work is, but this part is mistaken:

The intent is so that artists can make a living from their innovation.

The intent is “to promote the progress of science and the useful arts” so that, in the long run, the Public Domain is enriched with more works than would otherwise exist if no incentive were given. Allowing artists to make a living is nothing more than a means to that end.

MJBrune,

It promotes progress by giving people the ability to make the works. If they can’t make a living off of making the works then they aren’t going to do it as a job. Thus yes, the intent is so that artists can make a living off of their work so that more artists have the ability to make the art. It’s really that simple. The intent is so that more people can do it. It’s not a means to the end, it’s the entire point of it. Otherwise, you’d just have hobbyists contributing.

whelmer,

I like what you’re saying so I’m not trying to be argumentative, but to be clear copyright protections don’t simply protect those who make a living from their productions. You are protected by them regardless of whether you intend to make any money off your work and that protection is automatic. Just to expand upon what @grue was saying.

Otome-chan,

It's actually not copyright infringement at all.

Edit: and even if it was, copyright infringement is a moral right, it's a good thing. copyright is theft.

grue,

Edit: …copyright infringement is a moral right, it’s a good thing. copyright is theft.

Except when it’s being used to enforce copyleft.

MJBrune,

It’s likely copyright infringement but that’s for the courts to decide, not you or me. Additionally, “copyright infringement is a moral right” seems fairly wrong. Copyright laws currently are too steep, and I can agree with that, but if I make a piece of art like a book, video game, or movie, do I not deserve to protect it in order to get money? I’d argue that because we live in a capitalistic society, yes, I deserve to get paid for the work I did. If we lived in a better society that met the basic needs (or even complex needs) of every human, then I could see copyright laws being useless.

At the end of the day, the artists just want to be able to afford to eat, play games, and have shelter. Why in the world is that a bad thing in our current society? You can’t remove copyright law without first removing capitalism.

grue,

Additionally, “copyright infringement is a moral right” seems fairly wrong. Copyright laws currently are too steep, and I can agree with that, but if I make a piece of art like a book, video game, or movie, do I not deserve to protect it in order to get money? I’d argue that because we live in a capitalistic society, yes, I deserve to get paid for the work I did.

No. And it’s not just me saying that; the folks who wrote the Copyright Clause (James Madison and Thomas Jefferson) would disagree with you, too.

The natural state of a creative work is for it to be part of the Public Domain. Ideas are fundamentally different from property in the sense that property’s value comes from its exclusive use by its owner, whereas an idea’s value comes from spreading it, i.e., giving it away to others.

Here’s how Jefferson described it:

stable ownership is the gift of social law, and is given late in the progress of society. it would be curious then if an idea, the fugitive fermentation of an individual brain, could, of natural right, be claimed in exclusive and stable property. if nature has made any one thing less susceptible, than all others, of exclusive property, it is the action of the thinking power called an Idea; which an individual may exclusively possess as long as he keeps it to himself; but the moment it is divulged, it forces itself into the possession of every one, and the reciever cannot dispossess himself of it. it’s peculiar character too is that no one possesses the less, because every other possesses the whole of it. he who recieves an idea from me, recieves instruction himself, without lessening mine; as he who lights his taper at mine, recieves light without darkening me. that ideas should freely spread from one to another over the globe, for the moral and mutual instruction of man, and improvement of his condition, seems to have been peculiarly and benvolently designed by nature, when she made them, like fire, expansible over all space, without lessening their density in any point; and like the air in which we breathe, move, and have our physical being, incapable of confinement, or exclusive appropriation. inventions then cannot in nature be a subject of property. society may give an exclusive right to the profits arising from them as an encouragement to men to pursue ideas which may produce utility. but this may, or may not be done, according to the will and convenience of the society, without claim or complaint from any body.

Thus we see the basis for the rationale given in the Copyright Clause itself: “to promote the progress of science and the useful arts,” which is very different from creating some kind of entitlement to creators because they “deserve” it.

The true basis for copyright law in the United States is as a utilitarian incentive to encourage the creation of more works - a bounty for creating. Ownership of property is a natural right which the Constitution pledges to protect (see also the 4th and 5th Amendments), but the temporary monopoly called copyright is merely a privilege granted at the pleasure of Congress. Essentially, it’s a lease from the Public Domain, for the benefit of the Public. It is not an entitlement; what the creator of the work “deserves” doesn’t enter into it.

And if the copyright holder abuses his privilege such that the Public no longer benefits enough to be worth it, it’s perfectly just and reasonable for the privilege to be revoked.

At the end of the day, the artists just want to be able to afford to eat, play games, and have shelter. Why in the world is that a bad thing in our current society? You can’t remove copyright law without first removing capitalism.

This is a bizarre, backwards argument. First of all, a government-granted monopoly is the antithesis of the “free market” upon which capitalism is supposedly based. Second, granting monopolies is hardly the only way to accomplish either goal of “promoting the progress of science and the useful arts” or of helping creators make a living!

MJBrune,

Thus we see the basis for the rationale given in the Copyright Clause itself: “to promote the progress of science and the useful arts,” which is very different from creating some kind of entitlement to creators because they “deserve” it.

… You realize the reason it promotes progress is because it allows the creators to get paid for it, right? It’s not “they deserve it” it’s “they need to eat and thus they aren’t going to do it unless they make money.” Which is exactly my argument.

Ownership of property is a natural right which the Constitution pledges to protect (see also the 4th and 5th Amendments), but the temporary monopoly called copyright is merely a privilege granted at the pleasure of Congress

It’s a silly way to put that, since the power to grant that privilege is given to Congress in the Constitution.

Overall though, you are referencing a nearly 250-year-old document like it means something. The point comes down to people needing to eat in a capitalistic society.

This is a bizarre, backwards argument. First of all, a government-granted monopoly is the antethesis of the “free market” upon which capitalism is supposedly based.

Capitalism isn’t really based on a free market and never has been in practice.

Second, granting of monopolies is hardly the only way to accomplish either goal of “promoting the progress of science and the useful arts” or of helping creators make a living!

Sure but first enact those changes then try to change or break copyright. Don’t take away the only current way for artists to make money then say “Well, the system should be different.” You are causing people to starve at that point.

FaceDeer,

His art was not "stolen." That's not an accurate word to describe this process with.

It's not so much that "it was done before so it's fine now" as "it's a well-understood part of many people's workflows" that can be used to justify it. As well as the view that there was nothing wrong with doing it the first time, so what's wrong with doing it a second time?

Kara,

I don't like when people say "AI just traces/photobashes art." Because that simply isn't what happens.

But I do very much wish there were some sort of opt-out process, though ultimately any attempt at that just wouldn't work

chemical_cutthroat,

People that say that have never used AI art generation apps and are only regurgitating what they hear from other people who are doing the same. The amount of armchair AI denialists is astronomical.

ricecake,

There’s nothing stopping someone from licensing their art in a way that prohibits its use for AI training.
No one has created that license that I know of, but there are software licenses that do similar things, so it’s hardly an unprecedented notion.

The fact of the matter is that before people didn’t think it was necessary to have specific usage licenses attached to art because no one got funny feelings from people creating derivative works from them.

Zeus,

pirating photoshop is a well-understood part of many people’s workflows. that doesn’t make it legal or condoned by adobe

FaceDeer,

I don't know what this has to do with anything. Nothing was "pirated", either.

Backspacecentury,

Was he paid for his art to be included?

Kichae,

His work was used in a publicly available product without license or compensation. Including his work in the training dataset was, to the online vernacular use of the word, piracy.

They violated his copyright when they used his work to make their shit.

FaceDeer,

The product does not contain his work. So no copying was done, therefore no "piracy."

Zeus,

i’m not making a moral comment on anything, including piracy. i’m saying “but it’s part of my established workflow” is not an excuse for something morally wrong.

only click here if you understand analogy and hyperbole

if i say “i can’t write without kicking a few babies first”, it’s not an excuse to keep kicking babies. i just have to stop writing, or maybe find another workflow

FaceDeer,

The difference is that kicking babies is illegal whereas training and running an AI is not. Kind of a big difference.

Zeus,

did you click the thing saying that you understand analogies?

FaceDeer,

You're using an analogy as the basis for an argument. That's not what analogies are for. Analogies are useful explanatory tools, but only within a limited domain. Kicking a baby is not the same as creating an artwork, so there are areas in which they don't map to each other.

You can't dodge flaws in your argument by adding a "don't respond unless you agree with me" clause on your comment.

Zeus,

You’re using an analogy as the basis for an argument. That’s not what analogies are for. Analogies are useful explanatory tools, but only within a limited domain

actually that’s exactly what i was using it for.

Kicking a baby is not the same[^1] as creating an artwork, so there are areas in which they don’t map to each other.

if you read carefully, you’ll see that writing is analogous to creating an artwork, and kicking a baby is analogous to doing something that someone has asked you not to, and you’re continuing anyways. if you read even more carefully, you’ll see that i implied i wasn’t making a moral comment on ai, piracy, or even kicking babies

You can’t dodge flaws in your argument by adding a “don’t respond unless you agree with me” clause on your comment.

i didn’t intend to. i did it so i wouldn’t have to waste my time arguing with those who don’t understand analogies. however i seem to be doing that anyways, so if you’ll excuse me, i’m going to stop


edit: okay, i’ve been reading the rest of this thread, and you clearly don’t understand analogy. i have no idea why you clicked on my comment

[^1]: yes. analogous doesn’t mean “the same”. it means "able to draw demonstrative parallels between"

TwilightVulpine,

Not at the point of generation, but at the point of training it was. One of the sticking points of AI for artists is that their developers didn't even bother to seek permission. They simply said it was too much work and crawled artists' galleries.

Even publicly displayed art can only be used for certain previously-established purposes. By default you can't use them for derivative works.

FaceDeer,

At the point of training it was viewing images that the artists had published in a public gallery. Nothing pirated at that point either. They don't need "permission" to do that, the images are on display.

Learning from art is one of the previously-established purposes you speak of. No "derivative work" is made when an AI trains a model, the model does not contain any copyrightable part of the imagery it is trained on.

Kichae,

Being publicly viewable doesn't make them public domain. Being able to see something doesn't give you the right to use it for literally any other reason.

Full stop.

My gods, you're such an insufferable bootlicking fanboy of bullshit code jockeys. Make a good faith effort to actually understand why people dislike these exploitative assholes who are looking to make a buck off of other people's work for once, instead of just reflexively calling them all philistines who "just don't understand".

Some of us work on machine learning systems for a living. We know what they are and how they work, and they're fucking regurgitation machines. And people deserve to have control over whether we use their works in our regurgitation machines.

TwilightVulpine,

Of course they need permission to process images. No computer system can merely "view" an image without at least creating a copy for temporary use, and the purposes for which that can be done are strictly defined. Doing whatever you want just because you have access to the image is often copyright infringement.

People have the right to learn from images available publicly for personal viewing. AI is not yet people. Your whole argument relies on anthropomorphizing a tool, but it wouldn't even be able to select images to train its model without human intervention, which is done with the intent to replicate the artist's work.

I'm not one to usually bat for copyright but the disregard AI proponents have for artists' rights and their livelihood has gone long past what's acceptable, like the article shows.

FaceDeer,

If I run an image from the web through a program that generates a histogram of how bright its pixels are, am I suddenly a dirty pirate?
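As a concrete illustration of the kind of analysis FaceDeer is describing, here is a minimal sketch using numpy; the random array stands in for an image downloaded from the web:

```python
import numpy as np

# Stand-in for a downloaded image: random 8-bit RGB pixels.
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)

# Per-pixel brightness, here the simple average of the RGB channels.
brightness = image.mean(axis=2)

# 16-bin histogram of brightness values over the full 8-bit range.
hist, edges = np.histogram(brightness, bins=16, range=(0, 256))

print(hist.sum())  # one count per pixel: 64 * 64 = 4096
```

The histogram keeps only aggregate statistics; the original pixels cannot be reconstructed from its 16 counts, which is the crux of the analogy being debated here.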

TwilightVulpine,

If you run someone's artwork through a filter is it completely fine and new just because the output is not exactly like the input and it deletes the input after it's done processing?

There is a discussion to be made, in good faith, of where the line lies, what ought to be the rights of the audience, what ought to be the rights of the artists, what ought to be the rights of platforms, and what ought to be the limits of AI. To be fair, that's a difficult situation to determine, because in many aspects copyright is already too overbearing. Legally, many pieces of fan art and even memes are copyright infringement. But on the flipside, automating art away is too far to the other side. The reason why Copyright even exists, at least ideally, is so that the rights and livelihood of artists are protected and they are incentivized to continue creating.

Let's not pretend that is just analysis for the sake of academic understanding; there is a large number of people feeding artists' works into AI with the express purpose of getting artworks in their style without compensating them, something many artists have made clear they are not okay with. While they can't tell people not to practice styles like theirs, they can definitely tell people not to use their works in ways they do not allow.

FaceDeer,

If you run someone's artwork through a filter is it completely fine and new just because the output is not exactly like the input and it deletes the input after it's done processing?

No, that's a derivative work. An analysis of the brightness of the pixels is not a derivative work.

There is a discussion to be made, in good faith, of where the line lies, what ought to be the rights of the audience and what ought to be the rights of the artists, and what ought to be the rights of platforms, and what ought to be the limits of AI.

Sure, but the people crying "You're stealing art!" are not making a good faith argument. They're using an inaccurate, prejudicial word for the purpose of riling up an emotional response. Or perhaps they just don't understand what copyright is and why it is, which also puts their argument in a bad state.

The reason why Copyright even exists, at least ideally, is so that the rights and livelihood of artists are protected and they are incentivized to continue creating.

Case in point. That's not why copyright exists. The reason for the American version of copyright is established right in the constitution: "To promote the progress of science and useful arts". If you want to go more fundamental than just what the US is up to, the original Statute of Anne was titled "An Act for the Encouragement of Learning".

The purpose of copyright is not to protect the rights or livelihood of artists. The protection of the rights and livelihood of artist is a means to the actual purpose of copyright, which is to enrich the public domain by prompting artists to be productive and to publish their works.

An artist that opposes AIs like these is now actively hindering the enrichment of the public domain.

Backspacecentury,

Wow.. so in your mind there is basically no copyright and nobody owns anything. That is incredibly reductive and completely ignores centuries of legal precedent since the constitution was written.

You are basically claiming that anything that is ever put on display anywhere, ever is public domain and that piracy doesn't exist.

FaceDeer,

No, I'm not claiming that and I have no idea how you're managing to come to that conclusion from what I wrote. There's no connection I can discern.

Kichae,

Because it's a required assumption to make anything you say on the subject make any sense. The fact that you deny it has convinced me that you're just a troll.

TwilightVulpine,

A histogram cannot output similar images; it's pointless to argue the fine details of an analogy that doesn't apply to begin with.

To call it "stealing" might be inaccurate, but are the artists wrong to say that their intellectual property rights are being violated, when people use their works without consent to train AIs with the express purpose of replicating those artists' works? I have seen several artists point out AI users who brag to them that they are explicitly training AIs on those artists' galleries and show that they're outputting similar works.

The reason why Copyright even exists, at least ideally, is so that the rights and livelihood of artists are protected and they are incentivized to continue creating.

Case in point. That's not why copyright exists. The reason for the American version of copyright is established right in the constitution: "To promote the progress of science and useful arts".

How is "promoting the progress of useful arts" not the same as "incentivizing artists to continue creating"? Are you going to argue about what's "useful"? If there is interest in replicating artists' styles with AI, then that is an admission that the people doing it see use in those works. Otherwise, it's the same, and protecting their livelihoods through the privilege of a temporary intellectual monopoly is how that promotion of the arts is done.

I definitely see the value of the Public Domain, but if expanding it at any cost were the primary goal of copyright we wouldn't have roughly century-long copyright. Which I don't think is good per se, but that's another discussion. Still, the existence of copyright at all is a concession that, for artists and creators to develop their works and ultimately enrich humanity's culture, they need to be able to control their works and have a guarantee of a stable career, to the extent that they can sell their own work. It's a protection so that not everyone can show up imitating that artist and undercut them, undermining their capability to make new creative works. Which is what many people have been doing with AI.

If anything that could enrich the Public Domain was enough reason to drop Copyright, we wouldn't have any Copyright. The compromise is that Public Domain as a whole will be enriched when the artist's Copyright expires.

FaceDeer,

They were not used for derivative works. The AI's model produced by the training does not contain any copyrighted material.

If you click this link and view the images there then you are just as much a "pirate" as the AI trainers.

TwilightVulpine,

The models themselves are the derivative works. Those artists' works were copied and processed to create that model. There is a difference between a person viewing a piece of work and putting that work through a system to be processed. The way copyright works as defined, being allowed to view a work is not the same as being allowed to use it in any way you see fit. It's also inaccurate to speak of AIs as if they have the same abilities and rights as people.

Pulse,

Yes, it was.

One human artist can, over a life time, learn from a few artists to inform their style.

These AI setups are taking ALL the art from ALL the artists and using it as part of a for-profit business.

There is no ethical stance for letting billion-dollar tech firms hoover up all the art ever created to try and remix it for profit.

FaceDeer,

No, it wasn't. Theft is a well-defined word. When you steal something you take it away from them so that they don't have it any more.

It wasn't even a case of copyright violation, because no copies of any of Rutkowski's art were made. The model does not contain a copy of any of the training data (with an asterisk for the case of overfitting, which is very rare and which trainers do their best to avoid). The art it produces in Rutkowski's style is also not a copyright violation because you can't copyright a style.

There is no ethical stance for letting billion dollar tech firms hoover up all the art ever created to the try and remix it for profit.

So how about the open-source models? Or in this specific instance, the guy who made a LoRA for mimicking Rutkowski's style, since he did it free of charge and released it for anyone to use?

Pulse,

Yes copies were made. The files were downloaded, one way or another (even as a hash, or whatever digital asset they claim to translate them into) then fed to their machines.

If I go into a Ford plant, take pictures of their equipment, then use those to make my own machines, it’s still IP theft, even if I didn’t walk out with the machine.

Make all the excuses you want, you’re supporting the theft of other people’s life’s work then trying to claim it’s ethical.

ricecake,

Copies that were freely shared for the purpose of letting anyone look at them.

Do you think it’s copyright infringement to go to a website?

Typically, ephemeral copies that aren’t kept for a substantial period of time aren’t considered copyright violations, otherwise viewing a website would be a copyright violation for every image appearing on that site.

Downloading a freely published image to run an algorithm on it and then deleting it without distribution is basically the canonical example of ephemeral.

storksforlegs,
@storksforlegs@beehaw.org avatar

It's what you do with the copies that's the problem, not the physical act of copying.

FaceDeer,
FaceDeer avatar

Yes copies were made. The files were downloaded, one way or another (even as a hash, or whatever digital asset they claim to translate them into) then fed to their machines.

They were put on the Internet for that very purpose. When you visit a website and view an image there a copy of it is made in your computer's memory. If that's a copyright violation then everyone's equally boned. When you click this link you're doing exactly the same thing.

M0RNlNGW00D,

For disclosure I am a former member of the American Photographic Artists/Advertising Photographers of America, and I have works registered at the United States Copyright Office.

When we put works in our online portfolio, send mailers or physical copies of our portfolios, we're doing it as promotional works. There is no usage license attached to it. If loaded into memory for personal viewing, that's fine since it's not a commercial application nor violating the intent of that specific release: viewing for promotion.

Let's break down your example to help you understand what is actually going on. When we upload our works to third party galleries there is often a clause in the terms of service which states the artist uploading to the site grants a usage license for distribution and displaying of the image. Let's look at Section 17 of ArtStation's Terms of Service:

  17. License regarding Your Content

Your Content may be shared with third parties, for example, on social media sites to promote Your Content on the Site, and may be available for purchase through the Marketplace. You hereby grant royalty-free, perpetual, world-wide, licenses (the “Licenses”) to Epic and our service providers to use, copy, modify, reformat and distribute Your Content, and to use the name that you provide in association with Your Content, in connection with providing the Services; and to Epic and our service providers, members, users and licensees to use, communicate, share, and display Your Content (in whole or in part) subject to our policies, as those policies are amended from time-to-time

This is in conjunction with Section 16's opening line:

  16. Ownership

As between you and Epic, you will retain ownership of all original text, images, videos, messages, comments, ratings, reviews and other original content you provide on or through the Site, including Digital Products and descriptions of your Digital Products and Hard Products (collectively, “Your Content”), and all intellectual property rights in Your Content.

So when I click your link, I'm not engaging in a copyright violation. I'm making use of ArtStation's/Epic's license to distribute the original artist's works. When I save images from ArtStation that license does not transfer to me. Meaning if I were to repurpose that work it could be a copyright violation depending on the usage the artist agrees to. Established law states that I hold onto the rights of my work and any usage depends on what I explicitly state and agree to; emphasis on explicitly because the law will respect my terms and compensation first, and your intentions second. For example, if a magazine uses my images for several months without a license, I can document the usage time frame, send them an invoice, and begin negotiating because their legal team will realize that without a license they have no footing.

  • Yes, this also applies to journalism as well. If you've agreed to let a news outlet use your works on a breaking story for credit/exposure, then you provided a license for fair compensation in the form of credit/exposure.

I know this seems strange given how the internet freely transformed works for decades without repercussions. But as you know from sites like YouTube, copyright holders are not fans of people repurposing their works without mutually agreed-upon terms in the form of a license. If you remember the old show Mystery Science Theater 3000, they operated in the proper form: get license, transform work, commercialize. In the case of ArtStation, the site agrees to provide free hosting in compensation for the artist providing a license to distribute the work, without terms for monetization unless agreed upon through ArtStation's marketplace. At every step, the artist's rights to their work are respected and compensated when the law is applied.

If all this makes sense and we look back at AI art, well...

FaceDeer,
FaceDeer avatar

Meaning if I were to repurpose that work it could be a copyright violation depending on the usage the artist agrees to.

Training an AI doesn't "repurpose" that work, though. The AI learns concepts from it and then the work is discarded. No copyrighted part of the work remains in the AI's model. All that verbiage doesn't really apply to what's being done with the images when an AI trains on them, they are no longer being "used" for anything at all after training is done. Just like when a human artist looks at some reference images and then creates his own original work based on what he's learned from them.

TwilightVulpine,

Here is where a rhetorical sleight of hand is used by AI proponents.

It's displayed for people's appreciation. AI is not people, it is a tool. It's not entitled to the same rights as people, and the model it creates based on artists works is itself a derivative work.

Even among AI proponents, few believe that the AI itself is an autonomous being who ought to have rights over their own artworks, least of all the AI creators.

FaceDeer,
FaceDeer avatar

I use tools such as web browsers to view art. AI is a tool too. There's no sleight of hand, AI doesn't have to be an "autonomous being." Training is just a mechanism for analyzing art. If I wrote a program that analyzed pictures to determine what the predominant colour in them was that'd be much the same, there'd be no problem with me running it on every image I came across on a public gallery.

TwilightVulpine,

You wouldn't even be able to point a camera to works in public galleries without permission. Free for viewing doesn't mean free to do whatever you want with them, and many artists have made clear they never gave permission that their works would be used to train AIs.

Harrison,

Once you display an idea in public, it belongs to anyone who sees it.

Pulse,

By that logic I can sell anything I download from the web while also claiming credit for it, right?

Downloading to view != downloading to fuel my business.

FaceDeer,
FaceDeer avatar

No, and that's such a ridiculous leap of logic that I can't come up with anything else to say except no. Just no. What gave you that idea?

Pulse,

Because this thread was about the companies taking art, feeding it into their machine, and claiming not to have stolen it.

Then you compared that to clicking a link.

FaceDeer,
FaceDeer avatar

Yes, because it's comparable to clicking a link.

You said:

By that logic I can sell anything I download from the web while also claiming credit for it, right?

And that's the logic I can't follow. Who's downloading and selling Rutkowski's work? Who's claiming credit for it? None of that is being done in the first place, let alone being claimed to be "ok."

Pulse,

Because that is what they’re doing, just with extra steps.

The company pulled down his work, fed it to their AI, then sold the AI as their product.

Their AI wouldn’t work, at all, without the art they “clicked on”.

So there is a difference between me viewing an image in my browser and me turning their work into something for resale under my name. Adding extra steps doesn’t change that.

FaceDeer,
FaceDeer avatar

The company pulled down his work, fed it to their AI, then sold the AI as their product.

If you read the article, not even that is what's going on here. Stability AI:

  • Removed Rutkowski's art from their training set.
  • Doesn't sell their AI as a product.
  • Someone else added Rutkowski back in by training a LoRA on top of Stability's AI.
  • They aren't selling their LoRA as a product either.

So none of what you're objecting to is actually happening. All cool? Or will you just come up with some other thing to object to?

Pulse,

But they did.

(I’m on mobile so my formatting is meh)

They put his art in, only when called out did they remove it.

Once removed, they did nothing to prevent it being added back.

As for them selling the product, or not, at this point, they still used the output of his labor to build their product.

That’s the thing, everyone trying to justify why it’s okay for these companies to do it keeps leaning on semantics, legal definitions or “well, back during the industrial revolution…” to try and get around the fact that what these companies are doing is unethical. They’re taking someone else’s labor, without compensation or consent.

amju_wolf,
@amju_wolf@pawb.social avatar

No, but you can download Rutkowski’s art, learn from it how to paint in his exact style and create art in that style.

Which is exactly what the image generation AIs do. They’re perhaps just a bit too good at it, certainly way better than an average human.

Which makes it complicated and morally questionable depending on how exactly you arrive at the model and what you do with it, but you can’t definitively say it’s copyright infringement.

adespoton,

What makes it even trickier is that taking AI generated art and using it however you want definitively isn’t copyright infringement because only works by humans can be protected by copyright.

Pulse,

But that’s not what they did; converting it into a set of instructions a computer can use to recreate it is just adding steps.

And, yes, that’s what they’ve done, else we wouldn’t find pieces of others’ works mixed in.

Also, even if that was how it worked, it’s still theft of someone else’s labor to feed your business.

If it wasn’t, they would have asked for permission first.

Pulse,

I think my initial reply to you was meant to go somewhere else but Connect keeps dropping me to the bottom of the thread instead of where the reply I’m trying to get to is.

I’m going to leave it (for consistency sake) but I don’t think it makes much sense as a reply to your post.

Sorry about that!

Pulse,

You keep comparing what one person, given MONTHS or YEARS of their life, could do with one artist’s work to what a machine doing NOT THE SAME THING can do with thousands of artists’ work.

The machine is not learning their style, it’s taking pieces of the work and dropping it in with other people’s work then trying to blend it into a cohesive whole.

The analogy fails all over the place.

And I don’t care about copyright, I’m not an artist or an IP lawyer, or whatever. I can just look at a company stealing the labor of an entire industry and see it as bad.

FaceDeer,
FaceDeer avatar

The speed doesn't factor into it. Modern machines can stamp out metal parts vastly faster than blacksmiths with a hammer and anvil can, are those machines doing something wrong?

Pulse,

The machine didn’t take the blacksmiths work product and flood the market with copies.

The machine wasn’t fed 10,000 blacksmith made hammers then told to, sorta, copy those.

Justify this all you want, throw all the bad analogies at it you want, it’s still bad.

Again, if this wasn’t bad, the companies would have asked for permission. They didn’t.

FaceDeer,
FaceDeer avatar

That's not the aspect you were arguing about in the comment I'm responding to. You said:

You keep comparing what one person, given MONTHS or YEARS of their life, could do with one artist’s work to what a machine doing NOT THE SAME THING can do with thousands of artists’ work.

And that's what I'm talking about here. The speed with which the machine does its work is immaterial.

Though frankly, if the machine stamping out parts had somehow "learned" how to do it by looking at thousands of existing parts, that would be fine too. So I don't see any problem here.

Pulse,

And that’s where we have a fundamental difference of opinion.

A company hiring an engineer to design a machine that makes hammers, then hiring one (or more) people to build the machine to then make hammers, is the company benefiting from the work product of people they hired. While this may impact the blacksmith, they did not steal from the blacksmith.

A company taking someone else’s work product to then build their product, without compensation or consent, is theft of labor.

I don’t see those as equivalent situations.

FaceDeer,
FaceDeer avatar

At least now you're admitting that it's a difference of opinion, that's progress.

You think it should be illegal to do this stuff. Fine. I think copyright duration has been extended ridiculously long and should be a flat 30 years at most. But in both cases our opinions differ from what the law actually says. Right now there's nothing illegal about training an AI off of someone's lawfully-obtained published work, which is what was done here.

Pulse,

I’m not a fan of our copyright system. IMO, it’s far too long and should also include clauses that place anything not available for (easy) access in the public domain.

Also, I’m not talking about what laws say, should say or anything like that.

I’ve just been sharing my opinion that it’s unethical and I’ve not seen any good explanation for how stealing someone else’s labor is “good”.

TwilightVulpine,

Speed aside, machines don’t have the same rights as humans do, so the idea that they are “learning like a person so it’s fine” is like saying a photocopier’s output ought to be treated as an independent work because it replicated some other work, and it’s just so good and fast at it. AIs may not output identical work, but they still rely on taking an artist’s work as input, something the creator ought to have a say over.

jarfil,

One human artist can, over a life time, learn from a few artists to inform their style.

These AI setups […] ALL the art from ALL the artists

So humans are slow and inefficient, what’s new?

First the machines replaced hand weavers, then ice sellers went bust, all the calculators got sacked, now it’s time for the artists.

There is no ethical stance for letting billion-dollar tech firms hoover up all the art ever created to try and remix it for profit.

We stand on the shoulders of generations of unethical stances.

Pulse,

“Other people were bad, so I should be bad too.”

Cool.

storksforlegs,
@storksforlegs@beehaw.org avatar

Yes, which is why we should try to do better.

kitonthenet,

what I'm getting from all the AI stuff is the people in charge and the people that use it are scumbags

kboy101222,

Welcome to the wonderful world of the silicon valley tech era! Everything must be profitable at all costs! Everything must steal every tiny fact about you! Everything must include ! Everything must go through enshittification!

MossyFeathers, (edited )

Pretty much. There are ways of using it that most artists would be okay with. Most of the people using it flat out refuse to use it like that though.

Edit: To expand on this:

Most artists would be okay with AI art being used as reference material, inspiration, assisting with fleshing out concepts (though you should use concept artists for that in a big production), rapid prototyping and whatnot. Most only care that the final product is at least mostly human-made.

Artists generally want you to actually put effort into what you’re making because, at the end of the day, typing a prompt into stable diffusion has more in common with receiving a free commission from an artist than it has with actually being an artist. If you’re going to claim that something AI had a hand in as being your art, then you need to have done the majority of the work on it yourself.

The most frustrating thing to me, however, is that there are places in art that AI could participate in which would send artists over the moon, but it’s not flashy so no one seems to be working on making AI in those areas.

Most of what I’m personally familiar with has to do with 3d modeling, and in that discipline, people would go nuts if you released an AI tool that could do the UV work for you. Messing with UVs can be very tedious and annoying, to the point where most artists will just use a tool using conventional algorithms to auto-unwrap and pack UVs, and then call it a day, even if they’re not great.

Another area is in rigging and weight painting. In order to animate a model, you have to rig it to a skeleton (unless you’re a masochist or trying to make a game accurate to late 90s-early 00s animation), paint the bone weights (which bones affect which polygons, and by how much), add constraints, etc. Most 3d modelers would leap at the prospect of having high-quality rigging and UVs done for them at the touch of a button. However, again, because it’s not flashy to the general public, no one’s put any effort into making an AI that can do that (afaik at least).

Finally, even if you do use an AI in ways that most artists would accept as valid, you’ll still have to prove it, because there are so many people who put a prompt into stable diffusion, do some minor edits to fix hands (in older versions), and then try to pass it off as their own work.

DekkerNSFW,

Sadly, AI isn't as good with sparse data like vertices and bones, so most attempts to use AI on 3D stuff are via NeRFs, which are closer to a "photo" you can walk around in than to an actual 3D scene.

AzureDusk10,

The real issue here is the transfer of power away from the artist. This artist has presumably spent years and years perfecting his craft. Those efforts are now being used to line someone else’s pockets, in return for no compensation and a diminishment in the financial value of his work, and, by the sounds of it, little say in the matter either. That to me seems very unethical.

millie,

Personally, as an artist who spends the vast majority of their time on private projects that aren’t paid, I feel like it’s put power in my hands. It’s best at sprucing up existing work and saving huge amounts of time detailing. Because of stable diffusion I’ll be able to add those nice little touches and flashy bits to my work that a large corporation with no real vision has at their disposal.

To me it makes it much easier for smaller artists to compete, leveling the playing field a bit between those with massive resources and those with modest resources. That can only be a good thing in the long run.

But I also feel like copyright more often than not rewards the greedy and stifles the creative.

moon_matter,
moon_matter avatar

But that's sort of the nature of the beast when you put your content up for free on a public website. Does Kbin or Beehaw owe us money for our comments on this thread? What about everyone currently reading? At least KBin and Beehaw are making profit off of this.

The argument is not as clear cut as people are making it sound and it has potential to up-end some fundamental expectations around free websites and user-generated content. It's going to affect far more than just AI.

jarfil,

At least KBin and Beehaw are making profit off of this.

How?

RygelTheDom,

What blurry line? An artist doesn’t want his art stolen from him. Seems pretty cut and dry to me.

falsem,

If I look at someone's paintings, then paint something in a similar style did I steal their work? Or did I take inspiration from it?

Pulse,

No, you used it to inform your style.

You didn’t drop his art onto a screenprinter, smash someone else’s art on top, then try to sell t-shirts.

Trying to compare any of this to how one, individual, human learns is such a wildly inaccurate way to justify stealing someone else’s work product.

falsem,

If it works correctly it's not a screenprinter; the output is something unique.

Pulse,

The fact that folks can identify the source of various parts of the output, and that intact watermarks have shown up, shows that it doesn’t work like you think it does.

jarfil,

Does that mean the AI is not smart enough to remove watermarks, or that it’s so smart it can reproduce them?

falsem,

It means that it's stupid enough that it reproduces them - poorly.

Swedneck,
@Swedneck@discuss.tchncs.de avatar

It’s like staring yourself blind at artworks with watermarks until you start seeing artworks with blurry watermarks in your dreams

TheBurlapBandit,

It’s not smart or stupid. It does what it’s been trained on, nothing more.

nickwitha_k,

LLMs and directly related technologies are not AI and possess no intelligence or capability to comprehend, despite the hype. So, they are absolutely the former, though it’s rather like a bandwagon sort of thing (x number of reference images had a watermark, so that’s what the generated image should have).

jarfil,

LLMs […] no intelligence or capability to comprehend

That’s debatable. LLMs have shown emergent behaviors aside from what was trained, and they seem to be capable of comprehending relationships between all sorts of tokens, including multi-modal ones.

Anyway, Stable Diffusion is not an LLM; it’s more of a “neural network hallucination machine” with some cool hallucinations, which sometimes happen to be really close to some of the input data, or parts of it. It still needs to be “smart” enough to decompose the original data into enough of the right patterns that it can reconstruct part of the original from the patterns alone.

nickwitha_k,

Thanks for the clarification!

LLMs have indeed shown interesting behaviors but, from my experience with the technology and how it works, I would say that any claims of intelligence being possessed by a system that is only an LLM would be suspect and require extraordinary evidence to prove that it is not mistaken anthropomorphizing.

jarfil,

I don’t think an LLM alone can be intelligent… but I do think it can be the central building block for a sentient self-aware intelligent system.

Humans can be thought of as being made of a set of field-specific neural networks, tied together by a looping self-evaluating multi-modal LLM that we call “conscience”. The ability of an LLM to consume its own output, is what allows it to be used as the conscience loop, and current LLMs being trained on human language with all its human nuance, is an extra bonus.

Probably some other non-text multi-modal neural networks capable of consuming their own output could also be developed and be put in a loop, but right now we have LLMs, and we kind of understand most of what they’re saying, and they kind of understand most of what we’re saying, so that makes communication easier.

I mean, it is anthropomorphizing, but in this case I think it makes sense because it’s also anthropogenic, since these human language LLMs get trained on human language.

nickwitha_k,

Absolutely agreed with most of that. I think that LLMs and similar technologies are incredible and have great potential to be components of artificial intelligences. LLMs by themselves are more akin to “virtual intelligences” portrayed in the Mass Effect games, but currently generally with fewer guard rails to prevent hallucinations.

I suspect there may be a few other concurrent “loops”, likely not as well compared to LLMs (though some might be) running in our meat computers and their inefficiency and poor fidelity likely ends up being part of the factors that make our consciousness. Otherwise, your approximation makes a lot of sense. Still a lot to learn about our meat computers but, I really do hope we, as a species, succeed in making the world a bit less lonely (by helping other intelligence emerge).

jarfil,

There is some discussion about people “with an internal monologue”, and people “without”. I wonder if those might be some different ways of running that loop, or maybe some people have one loop take over others… and the whole “dissociative identity disorder” could be multiple loops competing for being the main one at different times.

Related to fidelity, some time ago I read an interesting thing: consciousness means having brainwaves out of sync, when they get in sync people go unconscious. From a background in electronics, I’ve always assumed the opposite (system clock and such), but apparently our consciousness emerges from the asynchronous differences, meaning the inefficiencies and poor fidelity might be a feature, not a bug.

Anyway, right now, as someone suffering from insomnia, I’d happily merge with some AI just to get a “pause” button.

FaceDeer,
FaceDeer avatar

They can't, and "intact" watermarks don't show up. You're the one who is misunderstanding how this works.

When a pattern is present very frequently the AI can learn to imitate it, resulting in things that closely resemble known watermarks. This is called "overfitting" and is avoided as much as possible. But even in those cases, if you examine the watermark-like pattern closely you'll see that it's usually quite badly distorted and only vaguely watermark-like.
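(A hedged toy illustration of what overfitting means, using curve fitting rather than an image model: give a model one free parameter per training point and it memorizes the data exactly instead of learning the underlying curve, the same failure mode that lets an image model reproduce a frequently repeated watermark.)

```python
import numpy as np

# Toy overfitting demo: fit polynomials of two capacities to 8 noisy
# samples of a sine curve. The high-capacity model interpolates the
# training data exactly -- it has memorized rather than generalized.
rng = np.random.default_rng(1)
x_train = np.linspace(0.0, 1.0, 8)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(scale=0.1, size=8)

under = np.polynomial.Polynomial.fit(x_train, y_train, deg=3)  # limited capacity
over = np.polynomial.Polynomial.fit(x_train, y_train, deg=7)   # one coeff per point

# The degree-7 fit passes through every training point (to numerical
# precision), while the degree-3 fit can only approximate the trend.
train_err_under = np.max(np.abs(under(x_train) - y_train))
train_err_over = np.max(np.abs(over(x_train) - y_train))

assert train_err_over < 1e-6
assert train_err_under > train_err_over
```

Image-model trainers work to stay on the "degree-3" side of this trade-off, which is why verbatim reproductions of training images are rare rather than the normal mode of operation.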

Pulse,

Yes, because “imitate” and “copy” are different things when stealing from someone.

I do understand how it works; the “overfitting” just lays clear what it does. It copies, but tries to sample things in a way that won’t look like clear copies. It has no creativity, it is trying to find new ways of making copies.

If any of this was ethical, the companies doing it would have just asked for permission. That they didn’t says everything you need to know.

I don’t usually have these kinds of discussions anymore. I got tired of conversations like this back in 2016, when it became clear that people will go to the ends of the earth to justify unethical behavior as long as the people being hurt by it are people they don’t care about.

FaceDeer,
FaceDeer avatar

And we're back to you calling it "stealing", which it certainly is not. Even if it was copyright violation, copyright violation is not stealing.

You should try to get the basic terminology right, at the very least.

Pulse,

Just because you’ve redefined theft in a way that makes you feel okay about it doesn’t change what they did.

They took someone else’s work product, fed it into their machine then used that to make money.

They stole someone’s labor.

FaceDeer,
FaceDeer avatar

I haven't "redefined" it, I'm using the legal definition. People do sometimes sloppily equate copyright violation with theft in common parlance, but they're in for a rude awakening if they intend to try translating that into legal action.

Using that term in an argument like this is merely trying to beg the question of whether it’s wrong: since most everyone agrees that stealing is wrong, you’re trying to cast the action of training an AI as something everyone will by default agree is wrong. But it’s not stealing, no matter how much you want it to be, and I’m calling that rhetorical trick out here.

If you want to argue that it's wrong you need to argue against the actual process that's happening, not some magical scenario where the AI trainers are somehow literally robbing people.

Pulse,

Taking someone’s work product and converting it, without compensation and consent, into your profit is theft of labor.

Adding extra steps, like, say, training an AI, doesn’t absolve the theft of labor.

Were it ethical, the companies doing it would have asked for permission and been given consent. They didn’t.

FaceDeer,
FaceDeer avatar

Taking someone’s work product and converting it, without compensation and consent, into your profit is theft of labor.

That's not what's going on here. The finished product contains only the style of the artist that the AI was trained on, and style is not copyrightable. Which is a damn good thing, as humans have been learning from each other's "work products" and mimicking each others' styles since time immemorial.

BTW, theft of labor means failing to pay wages or provide employee benefits owed to an employee by contract or law. You're using that term incorrectly too, Greg Rutkowski wasn't hired to do anything for the people who trained the AI off of his work.

Pulse,

No, I’m not using it incorrectly, I’m just not concerned with the legal definition as I’m not a lawyer or anyone tied up in this mess.

If you do a thing that takes time and skill, and then someone copies it, they stole your labor.

Saying they “copied his style”, the style he spent a lifetime crafting, then claiming they didn’t benefit, at no cost, from the labor he put into crafting that style because “well actually, the law says…” is a bad argument, as it tries to minimize what they did.

If their product could not exist without his labor, and they did not pay him for that labor, they stole his labor.

For, like, the fourth time in this thread: were this ethical, they would have asked for permission, they didn’t.

FaceDeer,
FaceDeer avatar

If you're just going to make up the meanings of words there's not much point in using them any further.

Pulse,

But I’m not.

You’re trying to say that, because this one law doesn’t say it’s bad, it must therefore be good (or at least okay).

I’m simply saying that if you profit from someone else’s labor, without compensating them (or at least getting their consent), you’ve stolen the output of that labor.

I’m happy to be done with this. I didn’t expect my first Lemmy comment to get any attention, but no, I’m not going to suddenly be okay with this just because the legal definition of “stealing labor” is too narrow to fit this scenario.

whelmer,

The law doesn’t even say it’s okay. What FaceDeer is referring to is that copyright infringement is a different category of crime than theft, which is defined as pertaining to physical property. It’s a meaningless point because, as you said, this isn’t a courtroom and we aren’t lawyers and the concept of intellectual property theft is well understood.

It’s a thing engineers and lawyers often seem to do, to take the way terms are used in a particular professional jargon and assume that that usage is “the real” usage.

fades,

I don’t disagree but stolen is a bit of a stretch

teichflamme,

Nothing was stolen.

Drawing inspiration from someone else by looking at their work has been around for centuries.

Imagine if the Renaissance couldn’t happen because artists didn’t want their style stolen.

FaceDeer,
FaceDeer avatar

His art was not "stolen."

KoboldCoterie,
@KoboldCoterie@pawb.social avatar

I don’t fully understand how this works, but if they’ve created a way to replicate his style that doesn’t involve using his art in the model, how is it problematic? I understand not wanting models to be trained using his art, but he doesn’t have exclusive rights to the art style, and if someone else can replicate it, what’s the problem?

This is an honest question, I don’t know enough about this topic to make a case for either side.

Hubi,

You’re pretty spot on. It’s not much different from a human artist trying to copy his style by hand but without reproducing the actual drawings.

delollipop,

Do you know how they recreated his style? I couldn’t find such information or frankly have enough understanding to know how.

But if they either use his works directly or works created by another GAI with his name/style in the prompt, my personal feeling is that would still be unethical, especially if they charge money to generate his style of art without compensating him.

Plus, I find the opt-out mentality really creepy and disrespectful.

“If he contacts me asking for removal, I’ll remove this.” Lykon said. “At the moment I believe that having an accurate immortal depiction of his style is in everyone’s best interest.”

fsniper,

I still have trouble understanding the distinction between "a human consuming different artists, and replicating the style" vs "software consuming different artists, and replicating the style".

Otome-chan,
Otome-chan avatar

there's no distinction. people are just robophobic.

KoboldCoterie,
@KoboldCoterie@pawb.social avatar

Do you know how they recreated his style? I couldn’t find such information or frankly have enough understanding to know how.

I don’t, but another poster noted that it involves using his art to create the LoRA.

Plus, I find that the opt-out mentality really creepy and disrespectful

I don’t know about creepy and disrespectful, but it does feel like they’re saying “I know the artist doesn’t want me to do this, but if he doesn’t specifically ask me personally to stop, I’m going to do it anyway.”
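For context on what a LoRA actually is: rather than retraining the whole model on his art, it trains a small low-rank "patch" that sits on top of the frozen base weights. A rough numerical sketch of the idea (illustrative numpy only — not actual Stable Diffusion code, and all names here are made up):

```python
import numpy as np

# Illustrative sketch of the LoRA idea: instead of retraining a large
# weight matrix W, a LoRA trains two small matrices A and B whose
# product is a low-rank patch applied on top of the frozen weights.

rng = np.random.default_rng(0)

d, r = 8, 2                      # layer width and (much smaller) LoRA rank
W = rng.normal(size=(d, d))      # frozen base-model weights
A = rng.normal(size=(r, d))      # trained on the target style's images
B = np.zeros((d, r))             # initialized to zero, trained alongside A
alpha = 1.0                      # user-chosen strength of the style

W_adapted = W + alpha * (B @ A)  # effective weights at generation time

# With B = 0 the adapted model is exactly the base model; training
# nudges B and A so the patch pushes outputs toward the target style.
print(np.allclose(W_adapted, W))  # → True (B starts at zero, so no change yet)
```

The point of the low-rank trick is that A and B are tiny compared to W, which is why a LoRA file trained on one artist's images can be a few megabytes while the base model is gigabytes.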

averyminya,

But if they either use his works directly or works created by another GAI with his name/style in the prompt, my personal feeling is that would still be unethical, especially if they charge money to generate his style of art without compensating him.

LoRAs are trained on image datasets, but those images are just publicly available anywhere. It’s really not much different from taking every still of The Simpsons and using it. What I don’t understand is how these are seen as problematic when the majority of end users utilizing AI are doing it under fair use.

No one charges for LoRAs or models AFAIK. If they do, it hasn’t come across the Stable Diffusion discords I moderate.

People actually selling AI generated art is also a different story, and that’s where it falls outside of fair use if the models being used contain copyrighted work. It seems pretty cut and dry: artists complained about being emulated by other artists before AI, so it’s only reasonable that it happens again. If people are profiting off it, there should at least be compensation to the original artist (say, if per-token payments could be given as royalties to the artist). However, on the other hand, think about The Simpsons, or Pokemon, or anything that has ever been sold as a sticker/poster/display item.

I’m gonna guess that a majority of people have no problem with that IP theft cause it’s a big company. Okay… so what if I love Greg but he doesn’t respond to my letters and e-mails begging to commission him for a Pokemon Rutkowski piece? Under fair use there’s no reason I can’t create that on my own, and if that means creating a dataset out of all of his paintings that I paid for, then it’s technically legal.

The only thing here that would be unethical or illegal is if his works are copyrighted and being redistributed. They aren’t being redistributed, and currently copyrighted materials aren’t protected from being used in AI models, since work produced by AI can’t be copyrighted. In other words, while it may be disrespectful to go against the artist’s wishes not to be used in AI, there’s no current grounds for it other than an artist not wanting to be copied… which is a tale as old as time.

TL;DR: model and LoRA makers aren’t charging, users can’t sell or copyright AI works, and copyrighted works aren’t protected from being used in AI models (currently). An artist not wanting to be used currently has no grounds other than making strikes against anything that redistributes copies of their work. If someone uses this LoRA to recreate Greg Rutkowski paintings and then proceeds to give or sell them, the artist can claim theft and damages… but the likelihood of an AI model being able to do this is low. The likelihood of someone selling these is higher, but from my understanding artistic styles are pretty much fair game any way you swing it.

I understand wanting to protect artists. Artists also get overly defensive at times. I’m not saying this guy is; I’m actually more on his side than my comment makes it out to be, especially after how he was treated in the discord I moderate. I’m mostly just pointing out that there’s a slippery slope both ways, and the current state of U.S. law on it.

SweetAIBelle,
SweetAIBelle avatar

Generally speaking, the way training works is this:
You put together a folder of pictures, all the same size. It would've been 1024x1024 in this case. Other models have used 768x768 or 512x512. For every picture, you also have a text file with a description.

The training software takes a picture, slices it into squares, generates a square the same size of random noise, then trains on how to change that noise into that square. It associates that training with tokens from the description that went with that picture. And it keeps doing this.

Then later, when someone types a prompt into the software, it tokenizes it, generates more random noise, and uses the denoising methods associated with the tokens you typed in. The pictures in the folder aren't actually kept by it anywhere.

From the side of the person doing the training, it's just put together the pictures and descriptions, set some settings, and let the training software do its work, though.

(No money involved in this one. One person trained it and plopped it on a website where people can download loras for free...)
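The noise-to-picture loop described above can be sketched roughly like this (a toy illustration only — real diffusion training uses a U-Net, a noise schedule, and a text encoder, none of which appear here; the "model" below is a stand-in):

```python
import numpy as np

# Heavily simplified sketch of the training step described above:
# corrupt an image patch with random noise, then score how well the
# model can recover that noise, conditioned on the caption tokens.

rng = np.random.default_rng(0)

def train_step(image_patch, caption_tokens, model):
    """One step: learn to turn noise back into the patch, tied to tokens."""
    noise = rng.normal(size=image_patch.shape)      # random noise target
    noisy = image_patch + noise                     # corrupted patch
    predicted_noise = model(noisy, caption_tokens)  # model's guess
    loss = np.mean((predicted_noise - noise) ** 2)  # how wrong the guess was
    return loss                                     # (gradient update omitted)

# A dummy "model" that always predicts zero noise, just to run the step:
dummy_model = lambda noisy, tokens: np.zeros_like(noisy)

patch = rng.normal(size=(4, 4))   # a small slice of a training image
loss = train_step(patch, ["greg", "rutkowski"], dummy_model)
print(loss > 0)                   # → True: an untrained model has nonzero loss
```

Generation then runs this in reverse, as the comment says: start from fresh random noise and repeatedly apply the denoising the model learned for the prompt's tokens — which is also why the original pictures don't need to be stored anywhere in the model.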

Rhaedas,
Rhaedas avatar

they charge money to generate his style of art without compensating him.

That's really the big thing, not just here but with any material that's been used to train on without permission or compensation. The difference is that most of it is so subtle it can't be picked out, but an artist's style is obviously a huge parameter, since his name was being used to call out those particular training aspects during generation. It's a bit hypocritical to say you aren't stealing someone's work when you stick his actual name in the prompt. It doesn't really matter how many levels the art style has been laundered through; it still originated from him.

conciselyverbose,

It is unconditionally impossible to own an artistic style. "Stealing a style" cannot be done.

snooggums,
snooggums avatar

Is drawing Mickey Mouse in a new pose copying the style or copying Mickey Mouse?

conciselyverbose,

The second.

I'm not sure how that's relevant here, though. There is nothing at all being copied but an aesthetic.

ricecake,

You said it yourself. You’re drawing Mickey Mouse in a new pose, so you’re copying Mickey Mouse.

Drawing a cartoon in the style of Mickey Mouse isn’t the same thing.

You can’t have a copyright on “big oversized smile, exaggerated posture, large facial features, oversized feet and hands, rounded contours and a smooth style of motion”.

Rhaedas,
Rhaedas avatar

And yet the artist's name is used to push the weights towards pictures in their style. I don't know what the correct semantics are for it, nor the legalities. That's part of the problem, the tech is ahead of our laws, as is usually the case.

conciselyverbose,

And yet the artist's name is used to push the weights towards pictures in their style.

That's not even vaguely new in the world of art.

Imitating style is the core of what art is. It's absolutely, unconditionally permitted under copyright law. It's not even a .01 out of 10 on the scale of unethical. It's what's supposed to happen.

The law might not cover this yet, but any law that restricts the fundamental right to build off of the ideas of others that are the core of the entirety of human civilization is unadulterated evil. There is no part of that that could possibly be acceptable to own.

Rhaedas,
Rhaedas avatar

I totally agree with you on protecting the basics of creativity and growth. I think the core issue is using "imitate" here. Is that what the model is doing, or is that an anthropomorphism, some sense that there's intelligence guiding the process? I know it seems like I'm nitpicking to further my point, but the fact that this is an issue to many even outside artwork says there's a real question here of what is and isn't okay.

conciselyverbose,

The AI is not intelligent. That doesn't matter.
Nothing anyone owns is being copied or redistributed. The creator isn't the tool; it's the person using the tool.

AI needs two things to work: an algorithm and data. If training is allowed to anyone, anyone can build their own models and use AI as a tool to create innovative new works with some ideas borrowed from other work.

If data is proprietary, they cannot. But Disney still can. They'll just as successfully flood out all the artists who can't use AI because they don't have a data set, but now they and the two other companies in the world who own IP are basically a monopoly (or tri- or whatever) and everyone else is screwed.

altima_neo,
@altima_neo@lemmy.zip avatar

It’s only using his name because the person who created the LORA trained it with his name. They could have chosen any other word.

Rhaedas,
Rhaedas avatar

True, and then because it's a black box there wouldn't be a known issue at all. Or maybe it would be much less of an issue because the words might have blended others into the mix, and his style wouldn't be as obvious in the outputs, and/or it would be easier to dismiss. Did the training involve actual input of his name, or was that pulled from the source trained on? How much control was in the training?

Peanutbjelly, (edited )

Just wait until you can copyright a style. Guess who will end up owning all the styles.

Spoiler: it’s wealthy companies like Disney and Warner. Oh, you used cross hatching? Disney owns the style now, you thief.

Copyright is fucked. Has been since before the Mickey Mouse Protection Act. Our economic system is fucked. People would rather fight each other and new tools instead of rallying against the actual problem, and it’s getting to me.

Pseu,

You’re right, copyright won’t fix it; copyright will just enable large companies to extract more from the creative space.

But who will benefit the most from AI? The artists seem to be getting screwed right now, and I’m pretty sure that Hasbro and Disney will love to cut costs and lay off artists as soon as this blows over.

Technology is capital, and in a capitalist system, that goes to benefit the holders of that capital. No matter how you cut it, laborers including artists are the ones who will get screwed.

TheBurlapBandit,

Me, I’ll benefit the most. I’ve been using a locally running instance of the free and open source AI software Stable Diffusion to generate artwork for my D&D campaigns and they’ve never looked more beautiful!

FaceDeer,
FaceDeer avatar

Same here. It's awesome being able to effectively "commission" art for any random little thing the party might encounter. And sometimes while generating images there'll be surprising details that give me new ideas, too. It's like brainstorming with ChatGPT but in visual form.

arvere,

my take on the subject, as someone who worked both in design and arts, and tech, is that the difficulty in discussing this is more rooted on what is art as opposed to what is theft

we mistakenly call illustrator/design work art work. art is hard to define, but most would agree it requires some level of expressiveness that emanates from the artist (from the condition of human existence, to social criticism, to beauty by itself) and that’s what makes it valuable. with SD and other AIs, control of this aspect is actually in the hands of the AI illustrator (or artist?)

whereas design and illustration are associated with product development and market. while they can contain art in a way, they have to adhere to a specific pipeline that is generally (if not always) for profit. to deliver the best-looking imagery for a given purpose in the shortest time possible

designers and illustrators were always bound to be replaced one way or another, as the system is always aiming to maximize profit (much like the now-old arguments between taxis and uber). they have every right to whine about it, but my guess is that this won’t save their jobs. they will have to adopt it as a very powerful tool in their workflow or change careers

on the other hand, artists that are worried, if they think the worth of their art lies solely in a specific style they’ve developed, they are in for an epiphany. they might soon realise they aren’t really artists, but freelance illustrators. that’s also not to mention other posts stating that we always climb on the shoulders of past masters - in all areas

both artists and illustrators that embrace this tool will benefit from it, either to express themselves quicker and skipping fine arts school or to deliver in a pace compatible with the market

all that being said, I would love to live in a society where people cared more about progress than money. imagine artists and designers actively contributing to this tech instead of wasting time fighting over IP and copyright…

Harrison,

Artists don’t own their styles, so it’s interesting to see them fight to protect them.

The only thing that makes anything valuable is that someone wants it, or at least wants it to exist. Nothing has intrinsic value because value itself is a human construction. This necessarily includes art.

itsgallus,

Artists should own their styles, but only in combination with their name. Forgery has always been a problem, but it’s obviously a lot more accessible thanks to AI. As a hobbyist artist myself, I don’t see monetary value as the main problem, but rather misrepresentation. Feel free to copy my style, but don’t attribute your art to me — AI generated or otherwise.

That being said, I’m super excited about this evolution of technology.

shagie,

Artists should own their styles, but only in combination with their name.

Consider how many of the small, independent artists produce art with the intentional style of Disney.

Styles being something subject to protection would probably be disastrous to all but the biggest names (who could hire lawyers).

LSNLDN,

Maybe I’m missing something here but isn’t Disney a great example of a style having ownership? One that Disney aggressively defend too. Difference being an individual person doesn’t have the resources of all of Disney so they can’t do much to defend their art… idk i’m rambling.

shagie,

Disney doesn’t defend the style. They defend the use of the characters. You can find countless pieces of fan art (draw X in the style of Disney) that haven’t been sued over.

Things like r/learntodraw’s “Trying to get better at Disney’s style” (yea, reddit) aren’t infringing.

But I can assure you that if style were something that could be protected, a great deal of the amateur and fan content currently produced by small-time artists couldn’t be made anymore. … And “you copied my style” would mean more than internet bragging points.

LSNLDN,

Ah I get it

storksforlegs, (edited )
@storksforlegs@beehaw.org avatar

There’s a lot of disagreement here on what is theft, what is art, what is copyright… etc

The main issue people have with AI is fundamentally how it is going to be used. I know there isn’t much we can do about it now, and it’s a shame because it has so much potential for good. Everyone defending AI is making a lot of valid points.

But at the end of the day it is a tool that is going to be misused by the rich and powerful to eliminate hundreds of millions of well paying careers, permanently. MOST well paying jobs in fact, not just artists. What the hell are people supposed to do? How is any of this a good thing?

sapient_cogbag,
@sapient_cogbag@infosec.pub avatar

What the hell are people supposed to do?

Eat the rich :)

More concretely, there are a number of smaller and larger sociopolitical changes that can be fought for. On the smaller side, there’s rethinking the way our society values people and pushing for some kind of UBI, on the larger side there’s shifting to postcapitalist economics and organisation to various degrees ^.^)

boff,

But the rich are the ones buying a lot of the art! Who will pay the artists if you eat the people with the money?

Harrison,

The rich and powerful must go away, or everyone else will suffer.

Soon enough they will succeed in eliminating most jobs, and the moment will come where action must be taken. Them or us.

Sandra,

The copyright argument is a bad argument against AI art. But there are also good arguments against it.

Steeve,

This person has no idea what machine learning actually is. And they hate such a generic concept on a “gut feeling” and come up with the reasons later?

If you want good reasons to hate AI generated art you won’t find them in this shitty blogpost.

liminalDeluge,

Apparently your comment really got to them, because the blogpost now contains a direct quote of you and a response.

Steeve, (edited )

Someone I don’t get along with very well wrote:

Hahaha yikes. Pretty cowardly to post their unhinged response on their blog where nobody can actually respond.

Also, why the hell would this person who hates the very general concept of machine learning (because of their gut lol) get a degree in a field that significantly utilizes machine learning? Computational linguistics is essentially driven by machine learning, so that’s uh… probably bullshit.

Thrashy,
@Thrashy@beehaw.org avatar

as a counterpoint, when the use-case for the tool is specifically “I want a picture that looks like it was painted by Greg Rutkowski, but I don’t want to pay Greg Rutkowski to paint it for me” that sounds like the sort of scenario that copyright was specifically envisioned to protect against – and if it doesn’t protect against that, it’s arguably an oversight in need of correction. It’s in AI makers and users’ interest to proactively self-regulate on this front, because if they don’t somebody like Disney is going to wade into this at some point with expensive lobbyists, and dictate the law to their own benefit.

That said, it’s working artists like Rutkowski, or friends of mine who scrape together a living off commissioned pieces, that I am most concerned for. Fantasy art like Greg makes, or personal character portraits of the sort you find on character sheets of long-running DnD games or as avatar images on forums like this one, make up the bread and butter of many small-time artists’ work, and those commissions are the ones most endangered by the current state of the art in generative AI. It’s great for would-be patrons that the cost of commissioning a mood piece for a campaign setting or a portrait of their fursona has suddenly dropped to basically zero, but it sucks for artists that their lunch is being eaten by an AI algorithm that was trained by slurping up all their work without compensation or even credit. For as long as artists need to get paid for their work in order to live, that’s inherently anti-worker.

Harrison,

It sucked for candle makers when electric lights were adopted. It sucked for farriers and stable hands and saddle makers when cars became affordable for the average person. Such is the cost of progress.

Sandra,

I’m also an artist, for whatever that’s worth, 🤷🏻‍♀️

Copyright is artificial scarcity which is ultimately designed for publishers, not workers.

One of the many, many bugs in market capitalism is that it can’t handle when something is difficult to initially create but when copies are cheap. Like a song. It’s tricky to write it but once you have it you can copy it endlessly. Markets based on supply and demand can’t handle that so they cooked up copyright as kind of a brutal patch, originally for book publishers in an era where normal readers couldn’t easily copy books anyway, only other publishers could.

It’s a patch that doesn’t work very well, since many artists still work super hard and still have to get by on scraps. Ultimately we need to re-think a lot of economics. Not only because digital threw everything on its ear and what could’ve been a cornucopia is now a tug of war for pennies, but also because of climate change (which is caused by fossil fuel transaction externalities being under-accounted for: if I sell you a can of gas, the full environmental impact of that is not going to be factored in properly. Sort of like how a memory leak works in a computer program).

I definitely sympathize with your artist friends, and I’ve been speaking out against AI art, at least some aspects of it (including, but not limited to, the environmental impact of new models and the increasing concentration of wealth and power for big data capital).

CapedStanker,

Here’s my argument: tough titties. Everything Greg Rutkowski has ever drawn or made has been inspired by other things he has seen and the experiences of his life, and this applies to all of us. Indeed, one cannot usually have experiences without the participation of others. Everyone wants to think they are special, and of course we are to someone, but to everyone no one is special. Since all of our work is based upon the work of everyone who came before us, then all of our work belongs to everyone. So tough fucking titties, welcome to the world of computer science, control c and control v is heavily encouraged.

In that Beatles documentary, Paul McCartney said he thought that once you uttered the words into the microphone, it belonged to everyone. Little did he know how right he actually was.

You think there is a line between innovation and infringement? Wrong, They are the same thing.

And for the record, I’m fine with anyone stealing my art. They can even sell it as their own. Attribution is for the vain.

storksforlegs,
@storksforlegs@beehaw.org avatar

You’re fine with someone stealing your art and selling it?

hglman,

Greg wants to get paid; remove the threat of poverty from the loss of control and it’s a nonissue.

Virulent,

Not every human activity deserves compensation

hglman,

Compensation shouldn’t be an aspect of most human activity.

CallumWells,

But every human activity desirable to others deserves compensation. If you want someone to do something for you, make something for you, or entertain you, then it deserves compensation. The way ads on the internet have trained a lot of people to think that much of the entertainment et cetera on the internet is free has been a negative for this. But at the same time, that ad-supported model does make it more available to people who otherwise couldn’t afford the price of admission. It’s partly democratizing, but it’s also a scourge.

Virulent,

Even if that were true it wouldn’t apply to this situation. The man wants monopoly rights to his art style. That’s insane.

CallumWells,

Has he said that no other humans could be inspired by his art style? If no then he hasn’t expressed a want for monopoly rights to his art style. But he has expressed that he doesn’t want computers to generate art explicitly to mimic his art style.

Also don’t make claims that are totally disconnected from the argument discussed. It’s dishonest discourse and serves as a way to brush aside the other argument. You didn’t make any counterargument to my argument and the point of this chain which came from you saying that “Not every human activity deserves compensation” as a reply to someone saying “Greg wants to get paid, remove the threat of poverty from the loss of control and its [sic] a nonissue.”

Your reply to me was inane.

ultratiem,
@ultratiem@lemmy.ca avatar

A sad but undeniable truth. I work in the industry. It’s standard for us to do mood boards. I have a love-hate relationship with them: they can be helpful for honing the design to a client’s liking and getting your bearings, but the fact is, they’re essentially what AI is doing by “borrowing” existing art as a reference. It’s the exact same thing. And that’s why I hate doing it. Because I don’t want to take someone’s button or background pattern.

Regardless of how I feel, I still can’t see how AI counts as “stealing” when industry-accepted practices that do the exact same thing aren’t.

ParsnipWitch,

I think people forget the reality when they take their supposedly brave and oh so altruistic stance of “there should be no copyright”.

When people already know they won’t even have a small chance of getting paid for the art they create, we will run out of artists.

Because most can not afford to learn and practice that craft without getting any form of payment. It will become a very rare hobby of a few decadent rich people who can afford to learn something like illustration in their free time.

Harrison,

Art is a part of the human condition. Whether or not it can be commercialised, it will endure as a pastime, just not as a vocation.

smart_boy,

If a company stole your art and copyrighted it such that it no longer belonged to everyone, in the same way that a Beatles record cannot be freely and openly shared, would you be fine with that?

fwygon, (edited )

AI art is factually not art theft. It is creation of art in the same rough and inexact way that we humans do it; except computers and AIs do not run on meat-based hardware that has an extraordinary number of features and demands that are hardwired to ensure survival of the meat-based hardware. It doesn’t have our limitations; so it can create similar works in various styles very quickly.

Copyright, on the other hand, is an entirely different and very sticky subject. By default, “All Rights Are Reserved” is something that usually is protected by these laws. These laws, however, are not grounded in modern times. They are grounded in the past, before the information age truly began its upswing.

Fair use generally encompasses all usage of information that is one or more of the following:

  • Educational; so long as it is taught as a part of a recognized class and within curriculum.
  • Informational; so long as it is being distributed to inform the public about valid, reasonable public interests. This is far broader than some would like; but it is legal.
  • Transformative; so long as the content is being modified in a substantial enough manner that it is an entirely new work that is not easily confused for the original. This too, is far broader than some would like; but it still is legal.
  • Narrative or Commentary purposes; so long as you’re not copying a significant amount of the whole content and passing it off as your own. Short clips with narration and lots of commentary interwoven between them is typically protected. Copyright is not intended to be used to silence free speech. This also tends to include satire; as long as it doesn’t tread into defamation territory.
  • Reasonable, ‘Non-Profit Seeking or Motivated’ Personal Use; People are generally allowed to share things amongst themselves and their friends and other acquaintances. Reasonable backup copies, loaning of copies, and even reproduction and presentation of things are generally considered fair use.

In most cases AI art is at least somewhat Transformative. It may be too complex for us to explain it simply; but the AI is basically a virtual brain that can, without error or certain human faults, ingest image information and make decisions based on input given to it in order to give a desired output.

Arguably; if I have license or right to view artwork; or this right is no longer reserved, but is granted to the public through the use of the World Wide Web…then the AI also has those rights. Yes. The AI has license to view, and learn from your artwork. It just so happens to be a little more efficient at learning and remembering than humans can be at times.

This does not stop you from banning AIs from viewing all of your future works. Communicating that fact with all who interact with your works is probably going to make you a pretty unpopular person. However; rightsholders do not hold or reserve the right to revoke rights that they have previously given. Once that genie is out of the bottle; it’s out…unless you’ve got firm enough contract proof to show that someone agreed to otherwise handle the management of rights.

In some cases; that proof exists. Good luck in court. In most cases however; that proof does not exist in a manner that is solid enough to please the court. A lot of the time; we tend to exchange, transfer and reserve rights ephemerally…that is in a manner that is not strictly always 100% recognized by the law.

Gee; Perhaps we should change that; and encourage the reasonable adaptation and growth of Copyright to fairly address the challenges of the information age.

joe_vinegar,

This is a very nice and thorough comment! Can you provide a reputable source for these points? (no criticism intended: as you seem knowledgeable, I’d trust you could have such reputable sources already selected and at hand, that’s why I’m asking).

throwsbooks,

Not the poster you’re replying to, but I’m assuming you’re looking for some sort of source that neural networks generate stuff, rather than plagiarize?

Google scholar is a good place to start. You’d need a general understanding of how NNs work, but it ends up leading to papers like this one, which I picked out because it has neat pictures as examples. arxiv.org/abs/1611.02200

What this one is doing is taking an input in the form of a face, and turning it into a cartoon. They call it an emoji, cause it’s based on that style, but it’s the same principle as how AI art is generated. Learn a style, then take a prompt (image or text) and do something with the prompt in the style.

ParsnipWitch,

Current AI models do not learn the way human brains do. And the way current models learn to “make art” is very different from how human artists do it. Repeatedly trying to recreate the work of other artists is something beginners do, and posting those works online was always shunned in artist communities. You also don’t learn to draw a hand by remembering where a thousand different artists put the lines so it looks like a hand.

shiri, (edited )

Edit: I made the classic blunder of US centrism here, my bad

@fwygon all questions of how AI learns aside, it's not legally theft but philosophically the topic is debatable and very hot button.

I can however comment pretty well on your copyright comments which are halfway there, but have a lot of popular inaccuracies.

Fair use is a very vague topic, and they explicitly chose to not make explicit terms on what is allowed but rather the intents of what is to be allowed. We've got some firm ones not because of specific laws but from abundance of case evidence.

  • Educational; so long as it is taught as a part of a recognized class and within curriculum.
  • Informational; so long as it is being distributed to inform the public about valid, reasonable public interests. This is far broader than some would like; but it is legal.
  • Narrative or Commentary purposes; so long as you're not copying a significant amount of the whole content and passing it off as your own. Short clips with narration and lots of commentary interwoven between them is typically protected. Copyright is not intended to be used to silence free speech. This also tends to include satire; as long as it doesn't tread into defamation territory.

These are basically all the same category and includes some misinformation about what it does and does not cover. It's permitted to make copies for purely informational, public interest (ie. journalistic) purposes. This would include things like showing a clip of a movie or a trailer to make commentary on it.

Education doesn't get any special treatment here, but research might (ie. making copies that are kept to a restricted environment, and only used for research purposes, this is largely the protection that AI models currently fall under because the training data uses copyrighted data but the resulting model does not).

  • Transformative; so long as the content is being modified in a substantial enough manner that it is an entirely new work that is not easily confused for the original. This too, is far broader than some would like; but it still is legal.

"Easily confused" is a rule from Trademark Law, not copyright. Copyright doesn't care about consumer confusion, but does care about substitution. That is, if the content could be a substitute for the original (ie. copying someone else's specific painting is going to be a violation up until the point where it can only be described as "inspired by" the painting)

  • Reasonable, 'Non-Profit Seeking or Motivated' Personal Use; People are generally allowed to share things amongst themselves and their friends and other acquaintances. Reasonable backup copies, loaning of copies, and even reproduction and presentation of things are generally considered fair use.

This is a very very common myth that gets a lot of people in trouble. Copyright doesn't care about whether you profit from it, more about potential lost profits.

Loaning is completely disconnected from copyright because no copies are being made ("digital loaning" is a nonsense attempt at claiming loaning; it is really just "temporary" copying, which is a violation).

Personal copies are permitted so long as you keep the original copy (or the original copy is explicitly irrecoverably lost or destroyed) as you already acquired it and multiple copies largely are just backups or conversions to different formats. The basic gist is that you are free to make copies so long as you don't give any of them to anyone else (if you copy a DVD and give either the original or copy to a friend, even as a loan, it's illegal).

It's not good to rely on it being "non-profit" as a copyright excuse, as that's more an area of leniency than a hard line. People far too often think it allows them to get away with copying things; it's really just for things like making backups of your movies or copying your CDs to MP3s.

... All that said, fun fact: AI works are not covered by copyright law.

To be copyrighted a human being must actively create the work. You can copyright things made with AI art, but not the AI art itself (ie. a comic book made with AI art is copyrighted, but the AI art in the panels is not, functioning much like if you made a comic book out of public domain images). Prompts and set up are not considered enough to allow for copyright (example case was a monkey picking up a camera and taking pictures, those pictures were deemed unable to be copyrighted because despite the photographer placing the camera... it was the monkey taking the photos).

Harrison,

This is true in US law but it should probably be noted that a lot of the “misconceptions” you’re outlining in OP’s comment are things that are legal in other jurisdictions

shiri,

@Harrison ::face palm:: thank you for calling that out, I'm so used to correcting fellow americans on copyright

Thevenin,

It doesn’t change anything you said about copyright law, but current-gen AI is absolutely not “a virtual brain” that creates “art in the same rough and inexact way that we humans do it.” What you are describing is called Artificial General Intelligence, and it simply does not exist yet.

Today’s large language models (like ChatGPT) and diffusion models (like Stable Diffusion) are statistics machines. They copy down a huge amount of example material, process it, and use it to calculate the most statistically probable next word (or pixel), with a little noise thrown in so they don’t make the same thing twice. This is why ChatGPT is so bad at math and Stable Diffusion is so bad at counting fingers – they are not making any rational decisions about what they spit out. They’re not striving to make the correct answer. They’re just producing the most statistically average output given the input.
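The "most statistically probable next word, with a little noise thrown in" mechanism Thevenin describes can be sketched with temperature sampling over a toy probability table. The vocabulary and probabilities below are invented for illustration; real models compute such distributions from billions of parameters:

```python
import math
import random

def sample_next_word(probs, temperature=0.8, rng=random):
    """Pick the next word from a {word: probability} table.

    Dividing log-probabilities by a temperature below 1 sharpens the
    distribution (closer to always picking the most probable word);
    the random draw is the "little noise" that keeps the output from
    repeating exactly each time.
    """
    words = list(probs)
    logits = [math.log(probs[w]) / temperature for w in words]
    m = max(logits)
    weights = [math.exp(l - m) for l in logits]  # numerically stable softmax
    return rng.choices(words, weights=weights, k=1)[0]

# Toy distribution for the context "the cat sat on the ..."
next_word_probs = {"mat": 0.6, "sofa": 0.25, "roof": 0.1, "moon": 0.05}
print(sample_next_word(next_word_probs))
```

At a very low temperature this almost always emits "mat"; at higher temperatures the rarer words appear more often. Nothing in the loop reasons about cats or furniture, which is the point being made.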

Current-gen AI isn’t just viewing art, it’s storing a digital copy of it on a hard drive. It doesn’t create, it interpolates. In order to imitate a person’s style, it must make a copy of that person’s work; describing the style in words is insufficient. If human artists (and by extension, art teachers) lose their jobs, AI training sets stagnate, and everything they produce becomes repetitive and derivative.

None of this matters to copyright law, but it matters to how we as a society respond. We do not want art itself to become a lost art.

Zyansheep,
  1. How do you know human brains don’t work in roughly the same way chatbots and image generators work?
  2. What is art? And what does it mean for it to become “lost”?
gianni,

He literally just explained why.

Zyansheep,

No, he just said AI isn’t like human brains because it’s a “statistical machine”. What I’m asking is how he knows that human brains aren’t statistical machines?

Human brains aren’t that good at direct math calculation either!

Also he definitely didn’t explain what “lost art” is.

Fauxreigner,

Current-gen AI isn’t just viewing art, it’s storing a digital copy of it on a hard drive.

This is factually untrue. For example, Stable Diffusion models are in the range of 2GB to 8GB, trained on a set of 5.85 billion images. If it was storing the images, that would allow approximately 1 byte for each image, and there are only 256 possibilities for a single byte. Images are downloaded as part of training the model, but they’re eventually “destroyed”; the model doesn’t contain them at all, and it doesn’t need to refer back to them to generate new images.

It’s absolutely true that the training process requires downloading and storing images, but the product of training is a model that doesn’t contain any of the original images.
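The bytes-per-image arithmetic above can be checked directly. The figures are the ones quoted in the comment (a 2–8 GB model, 5.85 billion training images); exact checkpoint sizes vary by model:

```python
# Back-of-the-envelope: how many bytes per training image a model would
# have available if it literally "stored" its training set.
GIB = 2**30
model_sizes_bytes = {"small checkpoint": 2 * GIB, "large checkpoint": 8 * GIB}
num_images = 5_850_000_000  # LAION-5B-scale dataset, per the comment

for name, size in model_sizes_bytes.items():
    per_image = size / num_images
    print(f"{name}: {per_image:.2f} bytes per image")
# Roughly a third of a byte up to ~1.5 bytes per image -- far too little
# to hold even a heavily compressed thumbnail of each one.
```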

None of that is to say that there is absolutely no valid copyright claim, but it seems like either option is pretty bad, long term. AI generated content is going to put a lot of people out of work and result in a lot of money for a few rich people, based off of the work of others who aren’t getting a cut. That’s bad.

But the converse, where we say that copyright is maintained even if a work is only stored as weights in a neural network, is also pretty bad; you’re going to have a very hard time defining that in such a way that it doesn’t cover the way humans store information and integrate it to create new art. That’s also bad. I’m pretty sure that nobody who creates art wants to have to pay Disney a cut because they once looked at some images Disney owns.

The best you’re likely to do in that situation is say it’s ok if a human does it, but not a computer. But that still hits a lot of stumbling blocks around definitions, especially where computers are used to create art constantly. And if we ever hit the point where digital consciousness is possible, that adds a whole host of civil rights issues.

Thevenin,

It’s absolutely true that the training process requires downloading and storing images

This is the process I was referring to when I said it makes copies. We’re on the same page there.

I don’t know what the solution to the problem is, and I doubt I’m the right person to propose one. I don’t think copyright law applies here, but I’m certainly not arguing that copyright should be expanded to include the statistical matrices used in LLMs and DPMs. I suppose plagiarism law might apply for copying a specific style, but that’s not the argument I’m trying to make, either.

The argument I’m trying to make is that while it might be true that artificial minds should have the same rights as human minds, the LLMs and DPMs of today absolutely aren’t artificial minds. Allowing them to run amok as if they were is not just unfair to living artists… it could deal irreparable damage to our culture, because the LLMs and DPMs of today cannot take up the mantle of the artists they crowd out or pass their knowledge down to the next generation.

Fauxreigner,

Thanks for clarifying. There are a lot of misconceptions about how this technology works, and I think it’s worth making sure that everyone in these thorny conversations has the right information.

I completely agree with your larger point about culture; to the best of my knowledge we haven’t seen any real ability to innovate, because the current models are built to replicate the form and structure of what they’ve seen before. They’re getting extremely good at combining those elements, but they can’t really create anything new without a person involved. There’s a risk of significant stagnation if we leave art to the machines, especially since we’re already seeing issues with new models including the output of existing models in their training data. I don’t know how likely that is; I think it’s much more likely that we see these tools used to replace humans for more mundane, “boring” tasks, not really creative work.

And you’re absolutely right that these are not artificial minds; the language models remind me of a quote from David Langford in his short story Answering Machine: “It’s so very hard to realize something that talks is not intelligent.” But we are getting to the point where the question of “how will we know” isn’t purely theoretical anymore.

raccoona_nongrata,
@raccoona_nongrata@beehaw.org avatar

deleted_by_author

  • Loading...
  • selzero,
    @selzero@syzito.xyz avatar

    @raccoona_nongrata @fwygon

    Rutkowski, Monet, and Rockwell could also not create without human art.

    All creativity is a combination of past creativity.

    Even Monet.

    Even Shakespeare.

    Even Beethoven.

    glenatron,
    @glenatron@dice.camp avatar

    @selzero @raccoona_nongrata @fwygon But human creativity is not ONLY a combination of past creativity. It is filtered through a lifetime of subjective experience and combined knowledge. Two human artists schooled on the same art history can still produce radically different art. Humans are capable of going beyond what has been done before.

    Before going too deep on AI creation spend some time learning about being human. After that, if you still find statistical averages interesting, go back to AI.

    selzero,
    @selzero@syzito.xyz avatar

    @glenatron @raccoona_nongrata @fwygon

    I mean, yes, you are right, but essentially, it is all external factors. They can be lived through external factors, or data fed external factors.

    I don't think there is a disagreement here other than you are placing a lot of value on "the human experience" being an in real life thing rather than a read thing. Which is not even fully true of the great masters. It's a form of puritan fetishisation I guess.

    glenatron,
    @glenatron@dice.camp avatar

    @selzero @raccoona_nongrata @fwygon I don't think it's even controversial. Will sentient machines ever have an equivalent experience? Very probably. Will they be capable of creating art? Absolutely.

    Can our current statistical bulk reincorporation tools make any creative leap? Absolutely not. They are only capable of plagiarism. Will they become legitimate artistic tools? Perhaps, when the people around them start taking artists seriously instead of treating them with disdain.

    selzero,
    @selzero@syzito.xyz avatar

    @glenatron @raccoona_nongrata @fwygon

    This angle is very similar to a debate going on in the cinema world, with Scorsese famously ranting that Marvel movies are "not movies"

    The point being that without a director's message being portrayed, these cookie-cutter cinema experiences, with algorithmically developed story lines, should not be classified as proper movies.

    But the fact remains, we consume them as movies.

    We consume AI art as art.

    glenatron,
    @glenatron@dice.camp avatar

    @selzero @raccoona_nongrata @fwygon I try not to consume it as art. There is plenty of original art by real artists. The averages of that dataset are less interesting to me than the original data points.

    aredridel,
    @aredridel@kolektiva.social avatar

    @selzero @glenatron @raccoona_nongrata @fwygon And thousands of people's creativity is in the Marvel movie, but one person hammering out a prompt on the AI art. They're still vastly different. Even the most banally corporate movie is still a work of staggering human creativity and working together.

    Stable diffusion image generators are not.

    selzero,
    @selzero@syzito.xyz avatar

    @aredridel @glenatron @raccoona_nongrata @fwygon

    Humans are also machines, biological machines, with a neurology based on neurons and synapses. As pointed out before, human "creativity" is also a result of past external consumption.

    When AI is used to eventually make a movie, it will use more than one AI model. Does that make a difference? I guess your "one person" example is Scorsese's "auteur"?

    It seems we are fetishizing biological machines over silicon machines?

    aredridel,
    @aredridel@kolektiva.social avatar

    @selzero @glenatron @raccoona_nongrata @fwygon no. Human relationships of cocreation over purely extractive ones. It’s not the biology (though humans have human relevant social drives simple algorithms don’t), it’s the relationships.

    It’s the obscuring of that, as if these clusters of GPUs cared about creating and formed relationships around it, that is so offensive.

    selzero,
    @selzero@syzito.xyz avatar

    @aredridel @glenatron @raccoona_nongrata @fwygon

    I don't understand, can you elaborate please. How is it not biological?

    aredridel,
    @aredridel@kolektiva.social avatar

    @selzero @glenatron @raccoona_nongrata @fwygon it’s biological the way zoology is physics. Technically true, but so deeply ignorant of the orders of magnitude of history and emergent complexity as to be irrelevant. It’s a profoundly reductive way to look at things, to the point of missing their fundamental nature.

    selzero,
    @selzero@syzito.xyz avatar

    @aredridel @glenatron @raccoona_nongrata @fwygon

    So, a human being a link in the chain of this historical cultural development of creation, is "more valuable" than a machine doing that?

    Who makes these rules?

    There is some kind of value structure at play here that I have not been made privy to?

    doug,
    @doug@union.place avatar

    deleted_by_author

  • Loading...
  • selzero,
    @selzero@syzito.xyz avatar

    @doug @aredridel @glenatron @raccoona_nongrata @fwygon

    So Doug, what you are saying is one of these things takes in external data, processes it, synthesizes it, and exports a derivative version, and the other thing is the machine?

    No wait, the other thing is the human?

    ... Wait...

    selzero,
    @selzero@syzito.xyz avatar

    @doug @aredridel @glenatron @raccoona_nongrata @fwygon

    Imagination IS the processing of retained information to create a derivative.

    aredridel,
    @aredridel@kolektiva.social avatar

    @selzero @doug @glenatron @raccoona_nongrata @fwygon not just: it’s about relationships. Nearly all art is social.

    doug,
    @doug@union.place avatar

    deleted_by_author

  • Loading...
  • selzero,
    @selzero@syzito.xyz avatar

    @doug

    Not a belief, my understanding of neurology and the psychology of creativity.

    Our perception of "an original thought" doesn't come into our mind through some alien ray introducing ideas.

    It is the culmination of our experiences, consumed ideas, exposure to other concepts, etc., which we form and shape to create new, different instances.

    There is nothing metaphysical or mystical happening here.

    doug,
    @doug@union.place avatar

    deleted_by_author

  • Loading...
  • selzero,
    @selzero@syzito.xyz avatar

    @doug

    Ok so let me give an example.

    I'm making a game. Let's say it's a Bomberman clone. My artists have made a bunch of ground and block tiles. They worked a whole week and we have 5 levels.

    We then use AI to create tile maps that are deviations from those 5, and get to 1,000 levels in an afternoon.

    The game benefits hugely from the deviations.

    Why is this in any way bad, immoral, or rude?

    aredridel,
    @aredridel@kolektiva.social avatar

    @selzero @doug because a few billionaires exploited the work of millions to do it, unpaid and without reciprocal relationships.

    selzero,
    @selzero@syzito.xyz avatar

    @aredridel @doug

    Not true.

    Who is being exploited?

    Also, I am by no means a Billionaire.

    I'm a Marxist FFS.

    aredridel,
    @aredridel@kolektiva.social avatar

    @selzero @doug every place listing anything online is being relentlessly scraped as “content”. Artists are having their styles replicated without the relationships that led to them being supported. It’s literally a bunch of corporations taking from us. I literally need to delete my Twitter posts right now because they’re gonna get pumped into yet another billionaire’s pet model project.

    They are literally exploiting all of us for corporate gain. As always, and this is a nice tasty unregulated spot.

    selzero,
    @selzero@syzito.xyz avatar

    @aredridel @doug

    I am not using any of that.

    I'm using data sets specifically licensed for machine learning training.

    Usually by merging data sets made public by academic projects.

    Or I make my own.

    The sample set is smaller but there are techniques that I use to circumvent that, by creating in between objects from the available objects.

    It is entirely plausible to create ML models ethically. I do it all the time. This is not the point.
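The "in between objects" technique selzero mentions, synthesizing new training samples by blending existing ones, can be sketched as simple linear interpolation (mixup-style augmentation). The feature vectors below are made up for illustration:

```python
import random

def interpolate(a, b, t):
    """Linear blend of two feature vectors: an 'in between' object."""
    return [ai + t * (bi - ai) for ai, bi in zip(a, b)]

def augment(samples, n_new, rng=random):
    """Grow a small sample set by blending random pairs of samples."""
    out = []
    for _ in range(n_new):
        a, b = rng.sample(samples, 2)   # pick two distinct originals
        out.append(interpolate(a, b, rng.random()))
    return out

tiles = [[0.0, 1.0], [1.0, 0.0], [0.5, 0.5]]  # toy tile feature vectors
augmented = tiles + augment(tiles, 10)
print(len(augmented))  # 13
```

Each synthetic sample stays within the bounds of the originals it was blended from, which is why this helps stretch a small, ethically sourced set rather than importing anyone else's data.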

    aredridel,
    @aredridel@kolektiva.social avatar

    @selzero @doug yeah! Not all model building is unethical! But gpt4 and llama and so many are!

    That’s the thing. And in a competitive system, we see the “best” set the expectations while this stuff stays niche.

    But at this point it’s not a machine creating art: it’s people. It’s a tool. And it’s one changing the relationships between humans in some uncomfortable ways even without.

    But that’s much better!

    selzero,
    @selzero@syzito.xyz avatar

    @aredridel @doug

    It definitely IS a tool!!

    And we can use the tool!

    Just because some dickhead wrote Mein Kampf doesn't mean we should stop using books.

    So, we are agreed. Machine Learning shouldn't be unethically sourced. No one is debating that.

    What is your problem with people using it once it is ethically sourced?

    aredridel,
    @aredridel@kolektiva.social avatar

    @selzero @doug depends on what people do with it and what relationships they disrupt too. And honestly needs to be 100% renewable energy too imo.

    And then credit given. Can the model list the influences used?

    This is an open question and we need to confront our existing bad handling of that because models amplify the amount of derivation we can do and the relationship when we do it.

    And then what are we doing to art itself? Are we now privileging inspiration under an academic training friendly license? That has big effects too, analogous to how the BSD licensed software plus corporate profiteering creates some nasty externalities for the software industry. (Whole lines of business are unviable because of it, but the people enabling it don’t get paid often. Legally, but not ethically)

    selzero,
    @selzero@syzito.xyz avatar

    @aredridel @doug

    All of these are valid arguments that should be discussed and formulated.

    I'm happy to do so, and I agree there are pitfalls to navigate. There are points I'll have to concede.

    As long as we are not outright raging against any use of AI. It is a really powerful tool, and is already vastly benefiting mankind.

    Right now I have to get back to work because this code won't complete itself... Yet.

    (image: HAL from 2001)

    aredridel,
    @aredridel@kolektiva.social avatar

    @selzero @doug yeah. I just think we need to really look at and be specific about it all.

    And I’m unconvinced of the benefit. I’m drowning in badly produced writing, most artists I know are being automated out of their meals, and new scams are everywhere. Power is unbalanced and capital has their hands on things.

    selzero,
    @selzero@syzito.xyz avatar

    @aredridel @doug

    As part of my AI Postgrad Diploma I created a model out of lung X-Rays that can detect disease with a single image, within seconds with negligible compute, which would make it possible for hospitals worldwide to replace a costly and resource intensive process, with a cloud upload of a single image. At the time I was just learning, and it took me a couple of weeks. Here in Cambridge there is a whole medical tech sector developing similar every day. AI is already saving lives.

    PSiReN,
    selzero,
    @selzero@syzito.xyz avatar

    @aredridel @doug

    The point is you guys are against machines creating art assets.

    So in my Bomberman clone example, what is the problem if all the art used was created by the team making the game?

    Why can't I use machine learned deviations of it?

    It's the same as training an intern to do it.

    But 1000x faster and the intern can learn something better.

    doug,
    @doug@union.place avatar

    deleted_by_author

  • Loading...
  • aredridel,
    @aredridel@kolektiva.social avatar

    @doug @selzero absolutely! And it’s these distinctions that matter and help surface the specific ethical issues.

    doug,
    @doug@union.place avatar

    deleted_by_author

  • Loading...
  • selzero,
    @selzero@syzito.xyz avatar

    @doug

    Not at all.

    AI is already productive to a high standard.

    The last product I sold that included AI models I delivered to DisneyLand (for the Avatar movie attraction) and last month won an award.

    It is being used everywhere.

    So, the point remains, what is wrong with using it to make tile sets for my Bomberman game?

    doug,
    @doug@union.place avatar

    deleted_by_author

  • Loading...
  • selzero,
    @selzero@syzito.xyz avatar

    @doug

    No I wouldn't. This is also why I don't use GitHub and similar code repositories that are used to train AI.

    It's also why I source my data ethically from sources that agree to allow ML training.

    But the ethics of data collection is another issue, let's not get side tracked.

    My point here is that there is no reason to be puritan over an asset being created by a neurology or a silicon chip.

    doug,
    @doug@union.place avatar

    deleted_by_author

  • Loading...
  • selzero,
    @selzero@syzito.xyz avatar

    @doug

    Again you are adding a narrative that quality suffers.

    It does not. 🤷

    doug,
    @doug@union.place avatar

    deleted_by_author

  • Loading...
  • selzero,
    @selzero@syzito.xyz avatar

    @doug

    Mario Kart WOULD be better with 1000 unique tracks.

    raccoona_nongrata,
    @raccoona_nongrata@beehaw.org avatar

    deleted_by_author

  • Loading...
  • selzero,
    @selzero@syzito.xyz avatar

    @raccoona_nongrata

    Actually. It is necessary. The process of creativity is much much more a synergy of past consumption than we think.

    It took 100,000 years to get from cave drawings to Leonardo da Vinci.

    Yes we always find ways to draw, but the pinnacle of art comes from a shared culture of centuries.

    raccoona_nongrata,
    @raccoona_nongrata@beehaw.org avatar

    deleted_by_author

  • Loading...
  • selzero,
    @selzero@syzito.xyz avatar

    @raccoona_nongrata

    A machine will not unilaterally develop an art form, and develop it for 100,000 years.

    Yes I agree with this.

    However, they are not developing an art form now.

    Nor did Monet, Shakespeare, or Beethoven develop an art form. Or develop it for 100,000 years.

    So machines cannot emulate that.

    But they can create the end product based on past creations, much as Monet, Shakespeare, and Beethoven did.

    ParsnipWitch,

    No, humans create and develop styles in art from “mistakes” that AI would not continue pursuing, because they personally like them or have a strange addiction to their own creative process. The current hand mistakes, for example, were perhaps one of the few interesting things AI has done…

    Current AI models recreate what is most liked by the majority of people.

    I_Has_A_Hat,

    And what if the human running the AI likes one of these “mistakes” and tells the AI to run with it?

    ParsnipWitch,

    But that’s still not how it works for an artist. I don’t mean stumbling upon an accident and using it in your work, but deliberately creating something that’s not liked and perfecting the way you do it. Someone who just instructs a tool and generates images at rapid speed goes down a very different path.

    raccoona_nongrata,
    @raccoona_nongrata@beehaw.org avatar

    deleted_by_author

  • Loading...
  • selzero,
    @selzero@syzito.xyz avatar

    @raccoona_nongrata

    Actually this is how we are training some models now.

    The models are separated and fed different versions of the source data, then we kick off a process of feeding them content that was created by the other models, creating a loop. It has proven very effective. It is also the case that this generation of AI-created content is the next generation's training data, simply by existing. What you are saying is absolutely false: generated content DOES have a lot of value as source data.

    selzero,
    @selzero@syzito.xyz avatar

    @raccoona_nongrata

    In fact, generating content purely for the purpose of training itself is one of the core techniques in training machine learning models.

    housepanther,

    @raccoona_nongrata @fwygon This is absolutely correct!

    timo21,
    @timo21@mastodon.sdf.org avatar

    @raccoona_nongrata @fwygon your reply caused me to consider an image of humans stuffed in a room making art for AI to use. Then I realized we have those: art made in prisons and schools is ripe for AI to steal.

    AceFuzzLord,

    All this proves to me, based on the context from this post, is that people are willing to commit copyright infringement in order to make a machine produce art in a specific style.

    Hawk,

    It doesn’t say anywhere they used copyrighted art though?

    Seems the new model might use art inspired by him, not his art itself.

    It’s a moral gray zone. If you add enough freely available works inspired by someone, the model can produce a similar style without using any original works.

    Is it still copyright infringement at that point?

    AceFuzzLord,

    If it’s inspired then at that point I guess it might not be copyright infringing unless it’s an accurate enough recreation of a copyrighted piece… And it looks like my mind filled in the gaps to assume it was copyrighted work being used.

    UnknownCircle, (edited )
    UnknownCircle avatar

    It's unlikely that this did not use his work; these models require input data. Even if they took similar art, that would only resolve the issue for Greg himself and would shift it to those other artists. Unless there is some sort of unspoken artistic genealogical purity that prevents artists with similar or inspired styles from having equal claim on their own creations when inspired by another.

    It also could be outputs generated from another AI model. But I don't think people who see ethical problems in this care about the number of steps removed and processing that occurs when the origin is his artwork and it ultimately outputs the same or similar style. The result is what bothers people, no matter how disparate or disconnected the source's influence is. If the models had simply found the Greg Rutkowski latent space through random chance people would still take issue with it.

    The ability and willingness to generate images in a style associated with a person, without consent, is a threat to that person's job security and shows a lack of value for them as a human. As if their creative expression is worth nothing but as a commodity to be consumed.

    The people supporting this don't care though. They want to consume this person's style in far greater quantities and variations than a human is capable of or willing to fulfill. That's why these debates are so fierce: the two sides have incentives that are in direct conflict with one another.

    We currently lack the economic ingenuity or willingness to create a system that will satisfy both parties. The barrier of entry to AI is low, someone at home has every incentive to maintain the status quo or even actively rail against artists. Artists will need a heavy handed approach from the government or as a collective to combat this effectively.

    KoboldCoterie,
    @KoboldCoterie@pawb.social avatar

    It also could be outputs generated from another AI model.

    This is an interesting point, and you get into some real Ship of Theseus territory. At what point is it no longer based on his work? How many iterations before he no longer has any claim to it at all?

    UnknownCircle,
    UnknownCircle avatar

    It's certainly interesting, but it's ultimately going to be wherever we collectively decide.

    One thing modern ML advancements have made painfully clear is that something being the "same" is variable based on what definition you use to determine sameness. Is it the same crew, same look, same feel, same atoms, same purpose, same name, etc... In the absence of such definition, everything ceases to be the same the moment after it has been described. As every single thing is constantly changing.

    Living things naturally generalize similarities, relationships, and associations into patterns that are re-used and abstracted. So we very much take these things for granted.

    If you like that type of thing you may enjoy Funes the Memorious by Jorge Luis Borges.

    Harrison,

    The ability and willingness to generate images in a style associated with a person, without consent, is a threat to that persons job security and shows a lack of value for them as a human. As if their creative expression is worth nothing but as a commodity to be consumed.

    You can’t own an art style. Copyright only extends to discrete works and characters. If I pay a street artist to draw a portrait of me in the style of Picasso, I’m not devaluing Picasso as a person.

    UnknownCircle,
    UnknownCircle avatar

    I agree that you can't own an art style in the US and I don't know if there's any other legal basis for artist's claims.

    Legality doesn't automatically deal with problems that are not based on whether something is legal or not. Losing money is losing money, regardless of whether it's the result of something legal. And people can feel devalued by something that is legal. It just means that the government will not use force to intervene in what you're doing, and may in fact use force to support you.

    Picasso is dead, so he has no ability to feel devalued. Artists who are alive do have that ability and other living people who value his works do as well.

    I myself support and love this technology. But it is clear that it causes problems for some people. I would prefer for it to exist in a form where artists could get value from and be happy with it too, but that is just not the case at present.

    SmoochyPit,

    If an image is represented as a network of weighted values describing subtle patterns in the image rather than a traditional grid of pixel color values, is that copy of the image still subject to copyright law?

    How much would you have to change before it isn’t? Or if you merged it with another representation, would that change your rights to that image?

    whelmer,

    It doesn’t matter how you recreate an image, if you recreate someone else’s work that is a violation of copyright.

    Stealing someone’s style is a different matter.

    KoboldCoterie,
    @KoboldCoterie@pawb.social avatar

    if you recreate someone else’s work that is a violation of copyright.

    Only if the work is copyrighted, and your copy does not constitute fair use…

    I could create a faithful reproduction of the Mona Lisa (or… I mean, someone could, I sure couldn’t), and it’s not violating copyright, because the Mona Lisa is not copyrighted.

    Crankpork,

    I could create a faithful reproduction of the Mona Lisa

    You could, but Stable Diffusion couldn’t. All it can do is output what it’s been fed. It doesn’t know composition, or colour theory. It doesn’t understand that something is a human, or a fabric, or how materials work; it just reproduces variations of what it’s been fed. Calling it “intelligence” is disingenuous: it doesn’t “know” anything, it just reproduces what’s built into its database, usually without the artist’s permission.

    raccoona_nongrata,
    @raccoona_nongrata@beehaw.org avatar

    deleted_by_author

    whelmer,

    Well said. Copyright is whatever, but the disrespect shown here is remarkable.

    FaceDeer,
    FaceDeer avatar

    Yeah, all these people yelling about how people who use AI art generators are "thieves" who are "stealing" art and that the things they generate are "not really art" and so forth. Very disrespectful.

    ParsnipWitch, (edited )

    We will probably all have to get used to this soon because I can see the same happening to authors, journalists and designers. Perhaps soon programmers, lawyers and all kinds of other people as well.

    It’s interesting how people on Lemmy pretend to be all against big corporations and capitalism and then happily indulge in the process of making artists jobless because “Muh technology cool!”. I don’t know the English word to describe this situation. In German I would say “Tja…”

    FaceDeer,
    FaceDeer avatar

    Just as quickly as people disregard the human art enjoyer, who now has access to a powerful tool to create art undreamed of a year ago.

    I have found over the years that forums that claim to be about various forms of art are almost always really about the artists that make that art, and have little to no regard for the people who are there just for the art itself. The AI art thing is just the latest and most prominent way of revealing this.

    trashhalo,

    Re: the “stolen” vs. “not stolen” comments: Copyright law as interpreted by judges is still being worked out on AI. Stay tuned for whether it’s defined as stolen or not. But even if the courts decide that existing copyright law defines training on artists’ work as legitimate use, the law can change, and it could still swing the way of the artist if Congress got involved.


    My personal opinion, which may not reflect what happens legally, is that I hope we all get more control over our data and how it’s used and sold, whether that’s my personal data like my comments and location, or my artistic data like my paintings. I think that would be a better world.

    FaceDeer,
    FaceDeer avatar

    Copyright law as interpreted by judges is still being worked out on AI. Stay tuned for whether it’s defined as stolen or not.

    You just contradicted yourself in two sentences. Copyright and theft are not the same thing. They are unrelated to each other. When you violate copyright you are not "stealing" anything. This art is not "stolen", full stop.

    MJBrune,

    The “nothing of value was lost when you pirate” argument. I’m a game developer who fully encourages people to pirate my games (or email me if they can’t afford my games and want a free Steam key) but I can tell you value is lost when people pirate content. Even if that’s simply a positive Steam review which in turn will put you higher up on placements on Steam’s algorithm which will gain you more sales. Something of value is lost when you pirate. It’s on the artist to determine if that value is acceptable to be lost. If they made their art for the sake of humanity or if they made art for the sake of survival in our shitty capitalistic society.

    So sorry, yes, something is lost and it’s because of capitalism. I’d argue otherwise if it didn’t mean someone didn’t get to eat or pay rent. I pirated a lot of media back in my day when I couldn’t afford that media. I used to tell myself I wouldn’t have likely bought those things anyways. That I wasn’t taking from someone. In reality, I would have waited for a sale and gotten that media for 5 dollars. 5 dollars is still a lot of money when selling something though. If I just gave you 5 dollars you could do something small but nice for yourself. You could go buy a lot of things with that sale money. Just because you aren’t spending 60 dollars on it doesn’t mean you would never buy it. The fact that you want to play it says you’d probably buy it. Maybe you’d refund it. Maybe you wouldn’t. Your time is worth something to you though, so when you pirate something you are spending something of value of your own to search for, download, and ingest that media.

    So how does this deal with copyright theft? Stealing something and using it devalues the original product. You’ve seen it a dozen times for better or worse. Minecraft is a great example of how it got devalued for a while there when everyone made Minecraft clones. My kid told me the other day that he got Minecraft on his tablet for free. It was some terrible knockoff he had been playing. I explained this and asked if he wanted the real thing. He said yes and I went and bought Minecraft. That in itself is proof that value is being lost by even legally taking an idea and copying it. A kid’s parent who didn’t know better would have just been like “Hmm, that’s great, have fun.” The best point I can make is that if there was one video game ever, to play a video game you would have to buy that one game. That one game would have more sales than any single game out there today. Clearly, something of value is being created by the exclusivity of copyright.

    There is, of course, a balance. What is copyrightable? What stifles creativity and innovation? I would say if these AI artists were able to recreate the style from prompts and only train the AI on images that it has the authority to distribute (public domain images, CC0, etc.) then it’s fair game. Training AI on copyrighted materials and then distributing derived works is copyright theft and should be deemed as such.

    storksforlegs,
    @storksforlegs@beehaw.org avatar

    Copying art for personal, non-commercial use is not theft, but copying someone’s art and then profiting (using their image without permission to enrich yourself) is theft.

    FaceDeer,
    FaceDeer avatar

    No.

    • Copying someone's art without permission is copyright violation, not theft.
    • These AIs aren't copying anyone's art, so it's not even copyright violation.

    whelmer,

    That’s your opinion. The contrary opinion would be that copyright infringement is the theft of intellectual property, which many people view as of equal substantiality to physical property.

    You can disagree with the concept of intellectual property but clearly there’s an alternative to your point of view that you can’t just dismiss by declaration.

    FaceDeer,
    FaceDeer avatar

    Take your opinion to a court of law and see how far it gets; they actually pay close attention to what words mean there. If copyright violation were theft, why would there be two different sets of laws to deal with them?

    whelmer,

    I’m sure you’re aware that the manner in which legal bureaucracies define terms is a form of jargon that differentiates legal language from actual language.

    They have separate categories of laws to deal with them because physical property is different than intellectual property. The same reason they use a different category of law to deal with identity theft.

    trashhalo,

    “Is copyright infringement theft?” is something that has been debated for as long as MP3s have been a thing. This is an old argument with lots of material on both sides scattered across the web. I clearly fall on the side that copyright infringement is theft, and theft is stealing.

    amju_wolf,
    @amju_wolf@pawb.social avatar

    There’s absolutely no debate, legal or otherwise.

    Theft, by definition, requires you to deprive someone of something. That simply cannot happen when you copy stuff. That’s why it’s called copyright infringement and not theft.

    You can only steal art by physically stealing an art piece - then and only then it’s theft.

    trashhalo,

    😀 Just want to note there’s a callout to this debate in the Wikipedia page on copyright infringement. I promise I didn’t add that paragraph there. en.wikipedia.org/wiki/Copyright_infringement

    https://beehaw.org/pictrs/image/6d22e6c4-06d1-4fad-a6f1-04ce9e0605a1.webp

    whelmer, (edited )

    What do you mean there is no debate? You’re debating it right now.

    Plenty of artists view it as theft when people take their work and use it for their own ends without their permission. Not everyone, sure. But it’s a bit odd to state so emphatically that there is no debate.

    Brave_heart,

    @trashhalo
    Test
