decrypt.co

doeknius_gloek, to technology in Greg Rutkowski Was Removed From Stable Diffusion, But AI Artists Brought Him Back - Decrypt

While some argue this is unethical, others justify it since Rutkowski’s art has already been widely used in Stable Diffusion 1.5.

What kind of argument is that supposed to be? We’ve stolen his art before so it’s fine? Dickheads. This whole AI thing is already sketchy enough, at least respect the artists that explicitly want their art to be excluded.

Otome-chan,
Otome-chan avatar

no one's art is being "stolen". you're mistaken.

Crankpork,

Aside from all the artists whose work was fed into the AI learning models without their permission. That art has been stolen, and is still being stolen. In this case very explicitly, because they outright removed his work, and then put it back when nobody was looking.

I_Has_A_Hat,

Let me give you a hypothetical that’s close to reality. Say an artist gets very popular, but doesn’t want their art used to teach AI. Let’s even say there’s legislation that prevents all this artist’s work from being used in AI.

Now what if someone else hires a bunch of cheap human artists to produce works in a style similar to the original artist, and then uses those works to feed the AI model? Would that still be stolen art? And if so, why? And if not, what is this extra degree of separation changing? The original artist is still not getting paid and the AI is still producing works based on their style.

wizardbeard,
@wizardbeard@lemmy.dbzer0.com avatar

Fine, you win the semantic argument about the use of the term “stealing”. Despite arguments about word choice, this is still a massively disrespectful and malicious action against the artist.

Crankpork,

So you hire people to trace the original art? That’s still copying it, and nobody is learning anything. It’s copying.

Harrison,

They didn’t say trace. A good artist can use the style of another artist when creating a new work.

Crankpork,

Yeah but a computer can’t, no matter how much people want to believe it can. Not with current tech.

Crankpork,

Comic book artists get in shit for tracing other peoples’ work all the time. Look up Greg Land. It’s shitty regardless of whether it’s a person doing it directly, or if someone built software to do it for them.

CallumWells,

Strictly speaking it wouldn’t exactly be stealing, but I would still consider it about equal to it, especially with regard to economic benefits. It may not be producing exact copies (which strictly speaking isn’t stealing either, but is violating copyright) or actually stealing, but it’s exploiting the style that most people would assume means that specific artist made it, and thus depriving that artist of the benefit of people wanting art from that artist/in that style.

Now, I’m not conflicted about people who have made millions off their art having people make imitations or copies, those people live more than comfortably enough. But in your example there are still other human artists benefiting, which is not the case for computationally generated works. It’s great for me to be able to have computers create art for a DnD campaign or something, but I still recognize that it’s making it harder for artists to earn a living from their skills. And to a certain degree it makes it so people who never would have had any such art now can. It’s in many ways like piracy with the same ethical framing. And as with piracy it may be that people that use AI to make them art become greater “consumers” of art made by humans as well, paying it forward. But it may also not work exactly that way.

Otome-chan,
Otome-chan avatar

People aren't allowed to produce similar styles to other humans? So do you support disney preventing anyone from making cartoons?

CallumWells,

Now you’re making a strawman. Other humans that are actually making art generally don’t fully copy a specific style, they draw inspiration from different sources and that amalgamation is their style.

Your comment reads as bad-faith to me. If it wasn’t meant as such you’re free to explain your stance properly instead of making strawman arguments.

grue,

That’s true, but only in the sense that theft and copyright infringement are fundamentally different things.

Generating stuff from ML training datasets that included works without permissive licenses is copyright infringement though, just as much as simply copying and pasting parts of those works in would be. The legal definition of a derivative work doesn’t care about the technological details.

(For me, the most important consequence of this sort of argument is that everything produced by Github Copilot must be GPL.)

rikudou,

That’s incorrect in my opinion. AI learns patterns from its training data. So do humans, by the way. It’s not copy-pasting parts of image or code.

grue,

By the same token, a human can easily be deemed to have infringed copyright even without cutting and pasting, if the result is excessively inspired by some other existing work.

Crankpork,

AI doesn’t “learn” anything, it’s not even intelligent. If you show a human artwork of a person they’ll be able to recognize that they’re looking at a human, how their limbs and expression work, what they’re wearing, the materials, how gravity should affect it all, etc. AI doesn’t and can’t know any of that, it just predicts how things should look based on images that have been put in its database. It’s a fancy Xerox.

rikudou,

Why do people who have no idea how some thing works feel the urge to comment on its working? It’s not just AI, it’s pretty much everything.

AI does learn, that’s the whole shtick and that’s why it’s so good at stuff computers used to suck at. “AI” is pretty much just a buzzword; the correct abbreviation is ML, which stands for Machine Learning - it’s even in the name.

AI also recognizes when it’s looking at a human! It can also recognize what they’re wearing, the material. AI is also better than humans at many, many things. It also sucks compared to humans at many others.

No images are in its database, you fancy Xerox.

Crankpork,

And I wish that people who didn’t understand the need for the human element in creative endeavours would focus their energy on automating things that should be automated, like busywork, and dangerous jobs.

If the prediction model actually “learned” anything, they wouldn’t have needed to add the artist’s work back after removing it. They had to, because it doesn’t learn anything, it copies the data it’s been fed.

rikudou,

Just because you repeat the same thing over and over it doesn’t become truth. You should be the one to learn, before you talk. This conversation is over for me, I’m not paid to convince people who behave like children of how things they’re scared of work.

MJBrune,

At the heart of copyright law is the intent. If an artist makes something, someone can’t just come along and copy it and resell it. The intent is so that artists can make a living for their innovation.

AI training on copyrighted images and then reproducing works derived from those images in order to compete with those images in the same style breaks the intent of copyright law. Equally, it does not matter if a picture is original. If you take an artist’s picture and recreate it with pixel art, there have already been cases where copyright infringement settlements have been made in favor of the original artist, despite the original picture not being used at all, just studied. See the Miles Davis Kind of Bloop cover art case.

grue,

You’re correct in your description of what a derivative work is, but this part is mistaken:

The intent is so that artists can make a living for their innovation.

The intent is “to promote the progress of science and the useful arts” so that, in the long run, the Public Domain is enriched with more works than would otherwise exist if no incentive were given. Allowing artists to make a living is nothing more than a means to that end.

MJBrune,

It promotes progress by giving people the ability to make the works. If they can’t make a living off of making the works then they aren’t going to do it as a job. Thus yes, the intent is so that artists can make a living off of their work so that more artists have the ability to make the art. It’s really that simple. The intent is so that more people can do it. It’s not a means to the end, it’s the entire point of it. Otherwise, you’d just have hobbyists contributing.

whelmer,

I like what you’re saying so I’m not trying to be argumentative, but to be clear copyright protections don’t simply protect those who make a living from their productions. You are protected by them regardless of whether you intend to make any money off your work and that protection is automatic. Just to expand upon what @grue was saying.

Otome-chan,
Otome-chan avatar

It's actually not copyright infringement at all.

Edit: and even if it was, copyright infringement is a moral right, it's a good thing. copyright is theft.

grue,

Edit: …copyright infringement is a moral right, it’s a good thing. copyright is theft.

Except when it’s being used to enforce copyleft.

MJBrune,

It’s likely copyright infringement but that’s for the courts to decide, not you or me. Additionally, “copyright infringement is a moral right” seems fairly wrong. Copyright laws currently are too steep and I can agree with that but if I make a piece of art like a book, video game, or movie, do I not deserve to protect it in order to get money? I’d argue that because we live in a capitalistic society so, yes, I deserve to get paid for the work I did. If we lived in a better society that met the basic needs (or even complex needs) of every human then I can see copyright laws being useless.

At the end of the day, the artists just want to be able to afford to eat, play games, and have shelter. Why in the world is that a bad thing in our current society? You can’t remove copyright law without first removing capitalism.

grue,

Additionally, “copyright infringement is a moral right” seems fairly wrong. Copyright laws currently are too steep and I can agree with that but if I make a piece of art like a book, video game, or movie, do I not deserve to protect it in order to get money? I’d argue that because we live in a capitalistic society so, yes, I deserve to get paid for the work I did.

No. And it’s not just me saying that; the folks who wrote the Copyright Clause (James Madison and Thomas Jefferson) would disagree with you, too.

The natural state of a creative work is for it to be part of a Public Domain. Ideas are fundamentally different from property in the sense that property’s value comes from its exclusive use by its owner, whereas an idea’s value comes from spreading it, i.e., giving it away to others.

Here’s how Jefferson described it:

stable ownership is the gift of social law, and is given late in the progress of society. it would be curious then if an idea, the fugitive fermentation of an individual brain, could, of natural right, be claimed in exclusive and stable property. if nature has made any one thing less susceptible, than all others, of exclusive property, it is the action of the thinking power called an Idea; which an individual may exclusively possess as long as he keeps it to himself; but the moment it is divulged, it forces itself into the possession of every one, and the reciever cannot dispossess himself of it. it’s peculiar character too is that no one possesses the less, because every other possesses the whole of it. he who recieves an idea from me, recieves instruction himself, without lessening mine; as he who lights his taper at mine, recieves light without darkening me. that ideas should freely spread from one to another over the globe, for the moral and mutual instruction of man, and improvement of his condition, seems to have been peculiarly and benvolently designed by nature, when she made them, like fire, expansible over all space, without lessening their density in any point; and like the air in which we breathe, move, and have our physical being, incapable of confinement, or exclusive appropriation. inventions then cannot in nature be a subject of property. society may give an exclusive right to the profits arising from them as an encouragement to men to pursue ideas which may produce utility. but this may, or may not be done, according to the will and convenience of the society, without claim or complaint from any body.

Thus we see the basis for the rationale given in the Copyright Clause itself: “to promote the progress of science and the useful arts,” which is very different from creating some kind of entitlement to creators because they “deserve” it.

The true basis for copyright law in the United States is as a utilitarian incentive to encourage the creation of more works - a bounty for creating. Ownership of property is a natural right which the Constitution pledges to protect (see also the 4th and 5th Amendments), but the temporary monopoly called copyright is merely a privilege granted at the pleasure of Congress. Essentially, it’s a lease from the Public Domain, for the benefit of the Public. It is not an entitlement; what the creator of the work “deserves” doesn’t enter into it.

And if the copyright holder abuses his privilege such that the Public no longer benefits enough to be worth it, it’s perfectly just and reasonable for the privilege to be revoked.

At the end of the day, the artists just want to be able to afford to eat, play games, and have shelter. Why in the world is that a bad thing in our current society? You can’t remove copyright law without first removing capitalism.

This is a bizarre, backwards argument. First of all, a government-granted monopoly is the antithesis of the “free market” upon which capitalism is supposedly based. Second, granting of monopolies is hardly the only way to accomplish either goal of “promoting the progress of science and the useful arts” or of helping creators make a living!

MJBrune,

Thus we see the basis for the rationale given in the Copyright Clause itself: “to promote the progress of science and the useful arts,” which is very different from creating some kind of entitlement to creators because they “deserve” it.

… You realize the reason it promotes progress is because it allows the creators to get paid for it, right? It’s not “they deserve it” it’s “they need to eat and thus they aren’t going to do it unless they make money.” Which is exactly my argument.

Ownership of property is a natural right which the Constitution pledges to protect (see also the 4th and 5th Amendments), but the temporary monopoly called copyright is merely a privilege granted at the pleasure of Congress

It’s a silly way to put that, since the “privilege granted” is given to Congress in the Constitution.

Overall though, you are referencing a 300-year-old document like it means something. The point comes down to people needing to eat in a capitalistic society.

This is a bizarre, backwards argument. First of all, a government-granted monopoly is the antithesis of the “free market” upon which capitalism is supposedly based.

Capitalism isn’t really based on a free market and never has been in practice.

Second, granting of monopolies is hardly the only way to accomplish either goal of “promoting the progress of science and the useful arts” or of helping creators make a living!

Sure but first enact those changes then try to change or break copyright. Don’t take away the only current way for artists to make money then say “Well, the system should be different.” You are causing people to starve at that point.

FaceDeer,
FaceDeer avatar

His art was not "stolen." That's not an accurate word to describe this process with.

It's not so much that "it was done before so it's fine now" as "it's a well-understood part of many peoples' workflows" that can be used to justify it. As well as the view that there was nothing wrong with doing it the first time, so what's wrong with doing it a second time?

Kara, (edited )
Kara avatar

I don't like when people say "AI just traces/photobashes art." Because that simply isn't what happens.

But I do very much wish there was some sort of opt-out process, but ultimately any attempt at that just wouldn't work

chemical_cutthroat,
chemical_cutthroat avatar

People that say that have never used AI art generation apps and are only regurgitating what they hear from other people who are doing the same. The amount of armchair AI denialists is astronomical.

ricecake,

There’s nothing stopping someone from licensing their art in a fashion that prohibits their use in that fashion.
No one has created that license that I know of, but there are software licenses that do similar things, so it’s hardly an unprecedented notion.

The fact of the matter is that before people didn’t think it was necessary to have specific usage licenses attached to art because no one got funny feelings from people creating derivative works from them.

Zeus,

pirating photoshop is a well-understood part of many peoples’ workflows. that doesn’t make it legal or condoned by adobe

FaceDeer,
FaceDeer avatar

I don't know what this has to do with anything. Nothing was "pirated", either.

Backspacecentury,

Was he paid for his art to be included?

Kichae,

His work was used in a publicly available product without license or compensation. Including his work in the training dataset was, to the online vernacular use of the word, piracy.

They violated his copyright when they used his work to make their shit.

FaceDeer,
FaceDeer avatar

The product does not contain his work. So no copying was done, therefore no "piracy."

Zeus,

i’m not making a moral comment on anything, including piracy. i’m saying “but it’s part of my established workflow” is not an excuse for something morally wrong.

only click here if you understand analogy and hyperbole

if i say “i can’t write without kicking a few babies first”, it’s not an excuse to keep kicking babies. i just have to stop writing, or maybe find another workflow

FaceDeer,
FaceDeer avatar

The difference is that kicking babies is illegal whereas training and running an AI is not. Kind of a big difference.

Zeus,

did you click the thing saying that you understand analogies?

FaceDeer,
FaceDeer avatar

You're using an analogy as the basis for an argument. That's not what analogies are for. Analogies are useful explanatory tools, but only within a limited domain. Kicking a baby is not the same as creating an artwork, so there are areas in which they don't map to each other.

You can't dodge flaws in your argument by adding a "don't respond unless you agree with me" clause on your comment.

Zeus, (edited )

You’re using an analogy as the basis for an argument. That’s not what analogies are for. Analogies are useful explanatory tools, but only within a limited domain

actually that’s exactly what i was using it for.

Kicking a baby is not the same[^1] as creating an artwork, so there are areas in which they don’t map to each other.

if you read carefully, you’ll see that writing is analogous to creating an artwork, and kicking a baby is analogous to doing something that someone has asked you not to, and you’re continuing anyways. if you read even more carefully, you’ll see that i implied i wasn’t making a moral comment on ai, piracy, or even kicking babies

You can’t dodge flaws in your argument by adding a “don’t respond unless you agree with me” clause on your comment.

i didn’t intend to. i did it so i wouldn’t have to waste my time arguing with those who don’t understand analogies. however i seem to be doing that anyways, so if you’ll excuse me, i’m going to stop


edit: okay, i’ve been reading the rest of this thread, and you clearly don’t understand analogy. i have no idea why you clicked on my comment

[^1]: yes. analogous doesn’t mean “the same”. it means "able to draw demonstrative parallels between"

TwilightVulpine,

Not at the point of generation, but at the point of training it was. One of the sticking points of AI for artists is that their developers didn't even bother to seek permission. They simply said it was too much work and crawled artists' galleries.

Even publicly displayed art can only be used for certain previously-established purposes. By default you can't use them for derivative works.

FaceDeer,
FaceDeer avatar

At the point of training it was viewing images that the artists had published in a public gallery. Nothing pirated at that point either. They don't need "permission" to do that, the images are on display.

Learning from art is one of the previously-established purposes you speak of. No "derivative work" is made when an AI trains a model, the model does not contain any copyrightable part of the imagery it is trained on.

Kichae,

Being publicly viewable doesn't make them public domain. Being able to see something doesn't give you the right to use it for literally any other reason.

Full stop.

My gods, you're such an insufferable bootlicking fanboy of bullshit code jockeys. Make a good faith effort to actually understand why people dislike these exploitative assholes who are looking to make a buck off of other people's work for once, instead of just reflexively calling them all philistines who "just don't understand".

Some of us work on machine learning systems for a living. We know what they are and how they work, and they're fucking regurgitation machines. And people deserve to have control over whether we use their works in our regurgitation machines.

TwilightVulpine,

Of course they need permission to process images. No computer system can merely "view" an image without at least creating a copy for temporary use, and the purposes for which that can be done are strictly defined. Doing whatever you want just because you have access to the image is often copyright infringement.

People have the right to learn from images available publicly for personal viewing. AI is not yet people. Your whole argument relies on anthropomorphizing a tool, but it wouldn't even be able to select images to train its model without human intervention, which is done with the intent to replicate the artist's work.

I'm not one to usually bat for copyright but the disregard AI proponents have for artists' rights and their livelihood has gone long past what's acceptable, like the article shows.

FaceDeer,
FaceDeer avatar

If I run an image from the web through a program that generates a histogram of how bright its pixels are, am I suddenly a dirty pirate?

TwilightVulpine,

If you run someone's artwork through a filter is it completely fine and new just because the output is not exactly like the input and it deletes the input after it's done processing?

There is a discussion to be made, in good faith, of where the line lies, what ought to be the rights of the audience and what ought to be the rights of the artists, and what ought to be the rights of platforms, and what ought to be the limits of AI. To be fair, that's a difficult situation to determine, because in many aspects copyright is already too overbearing. Legally, many pieces of fan art and even memes are copyright infringement. But on the flipside automating art away is too far to the other side. The reason why Copyright even exists, at least ideally, is so that the rights and livelihood of artists are protected and they are incentivized to continue creating.

Let's not pretend this is just analysis for the sake of academic understanding; there is a large amount of people who are feeding artists' works into AI with the express purpose of getting artworks in their style without compensating them, something many artists have made clear they are not okay with. While they can't tell people not to practice styles like theirs, they can definitely tell people not to use their works in ways they do not allow.

FaceDeer,
FaceDeer avatar

If you run someone's artwork through a filter is it completely fine and new just because the output is not exactly like the input and it deletes the input after it's done processing?

No, that's a derivative work. An analysis of the brightness of the pixels is not a derivative work.

There is a discussion to be made, in good faith, of where the line lies, what ought to be the rights of the audience and what ought to be the rights of the artists, and what ought to be the rights of platforms, and what ought to be the limits of AI.

Sure, but the people crying "You're stealing art!" are not making a good faith argument. They're using an inaccurate, prejudicial word for the purpose of riling up an emotional response. Or perhaps they just don't understand what copyright is and why it is, which also puts their argument in a bad state.

The reason why Copyright even exists, at least ideally, is so that the rights and livelihood of artists are protected and they are incentivized to continue creating.

Case in point. That's not why copyright exists. The reason for the American version of copyright is established right in the constitution: "To promote the progress of science and useful arts". If you want to go more fundamental than just what the US is up to, the original Statute of Anne was titled "An Act for the Encouragement of Learning".

The purpose of copyright is not to protect the rights or livelihood of artists. The protection of the rights and livelihood of artist is a means to the actual purpose of copyright, which is to enrich the public domain by prompting artists to be productive and to publish their works.

An artist that opposes AIs like these is now actively hindering the enrichment of the public domain.

Backspacecentury,

Wow... so in your mind there is basically no copyright and nobody owns anything. That is incredibly reductive and completely ignores centuries of legal precedent since the constitution was written.

You are basically claiming that anything that is ever put on display anywhere is public domain and that piracy doesn't exist.

FaceDeer,
FaceDeer avatar

No, I'm not claiming that and I have no idea how you're managing to come to that conclusion from what I wrote. There's no connection I can discern.

Kichae,

Because it's a required assumption to make anything you say on the subject make any sense. The fact that you deny it has convinced me that you're just a troll.

TwilightVulpine,

A histogram cannot output similar images; it's pointless to argue the fine details of an analogy that doesn't apply to begin with.

To call it "stealing" might be inaccurate, but are the artists wrong to say that their intellectual property rights are being violated, when people use their works without consent to train AIs with the express purpose of replicating those artists' works? I have seen several artists pointing out AI users who brag to them that they are explicitly training AIs using those artists' galleries and show that it's outputting similar works.

The reason why Copyright even exists, at least ideally, is so that the rights and livelihood of artists are protected and they are incentivized to continue creating.

Case in point. That's not why copyright exists. The reason for the American version of copyright is established right in the constitution: "To promote the progress of science and useful arts".

How is "promoting the progress of useful arts" not the same as "incentivizing artists to continue creating"? Are you going to argue about what's "useful"? If there is interest in replicating artists' styles with AI, then that is an admission that the people doing it see use in those works. Otherwise, it's the same, and protecting their livelihoods through the privilege of a temporary intellectual monopoly is how that promotion of arts is done.

I definitely see the value of the Public Domain, but if expanding it at any cost was the primary goal of copyright we wouldn't have roughly century-long copyright. Which I don't think is good per se, but that's another discussion. Still, the existence of copyright at all is a concession that grants that for artists and creators to develop their works and ultimately enrich humanity's culture, they need to be able to control their works and have a guarantee of a stable career, to the extent that they can sell their own work. It's a protection so that not everyone can show up imitating that artist and undercut them, undermining their capability to make new creative works. Which is what many people have been doing with AI.

If anything that could enrich the Public Domain was enough reason to drop Copyright, we wouldn't have any Copyright. The compromise is that Public Domain as a whole will be enriched when the artist's Copyright expires.

FaceDeer,
FaceDeer avatar

They were not used for derivative works. The AI's model produced by the training does not contain any copyrighted material.

If you click this link and view the images there then you are just as much a "pirate" as the AI trainers.

TwilightVulpine,

The models themselves are the derivative works. Those artists' works were copied and processed to create that model. There is a difference between a person viewing a piece of work and putting that work to be processed through a system. The way copyright works as defined, being allowed to view a work is not the same as being allowed to use it in any way you see fit. It's also inaccurate to speak of AIs as if they have the same abilities and rights as people.

Pulse,

Yes, it was.

One human artist can, over a lifetime, learn from a few artists to inform their style.

These AI setups are taking ALL the art from ALL the artists and using it as part of a for-profit business.

There is no ethical stance for letting billion-dollar tech firms hoover up all the art ever created to try and remix it for profit.

FaceDeer,
FaceDeer avatar

No, it wasn't. Theft is a well-defined word. When you steal something you take it away from them so that they don't have it any more.

It wasn't even a case of copyright violation, because no copies of any of Rutkowski's art were made. The model does not contain a copy of any of the training data (with an asterisk for the case of overfitting, which is very rare and which trainers do their best to avoid). The art it produces in Rutkowski's style is also not a copyright violation because you can't copyright a style.

There is no ethical stance for letting billion-dollar tech firms hoover up all the art ever created to try and remix it for profit.

So how about the open-source models? Or in this specific instance, the guy who made a LoRA for mimicking Rutkowski's style, since he did it free of charge and released it for anyone to use?

Pulse,

Yes copies were made. The files were downloaded, one way or another (even as a hash, or whatever digital asset they claim to translate them into) then fed to their machines.

If I go into a Ford plant, take pictures of their equipment, then use those to make my own machines, it’s still IP theft, even if I didn’t walk out with the machine.

Make all the excuses you want, you’re supporting the theft of other people’s life’s work then trying to claim it’s ethical.

ricecake,

Copies that were freely shared for the purpose of letting anyone look at them.

Do you think it’s copyright infringement to go to a website?

Typically, ephemeral copies that aren’t kept for a substantial period of time aren’t considered copyright violations, otherwise viewing a website would be a copyright violation for every image appearing on that site.

Downloading a freely published image to run an algorithm on it and then deleting it without distribution is basically the canonical example of ephemeral.

storksforlegs,
@storksforlegs@beehaw.org avatar

It's what you do with the copies that's the problem, not the physical act of copying.

FaceDeer,
FaceDeer avatar

Yes copies were made. The files were downloaded, one way or another (even as a hash, or whatever digital asset they claim to translate them into) then fed to their machines.

They were put on the Internet for that very purpose. When you visit a website and view an image there a copy of it is made in your computer's memory. If that's a copyright violation then everyone's equally boned. When you click this link you're doing exactly the same thing.

M0RNlNGW00D,

For disclosure I am a former member of the American Photographic Artists/Advertising Photographers of America, and I have works registered at the United States Copyright Office.

When we put works in our online portfolio, or send mailers or physical copies of our portfolios, we're doing it as promotional works. There is no usage license attached. If loaded into memory for personal viewing, that's fine, since it's not a commercial application nor a violation of the intent of that specific release: viewing for promotion.

Let's break down your example to help you understand what is actually going on. When we upload our works to third party galleries there is often a clause in the terms of service which states the artist uploading to the site grants a usage license for distribution and displaying of the image. Let's look at Section 17 of ArtStation's Terms of Service:

  17. License regarding Your Content

Your Content may be shared with third parties, for example, on social media sites to promote Your Content on the Site, and may be available for purchase through the Marketplace. You hereby grant royalty-free, perpetual, world-wide, licenses (the “Licenses”) to Epic and our service providers to use, copy, modify, reformat and distribute Your Content, and to use the name that you provide in association with Your Content, in connection with providing the Services; and to Epic and our service providers, members, users and licensees to use, communicate, share, and display Your Content (in whole or in part) subject to our policies, as those policies are amended from time-to-time

This is in conjunction with Section 16's opening line:

  16. Ownership

As between you and Epic, you will retain ownership of all original text, images, videos, messages, comments, ratings, reviews and other original content you provide on or through the Site, including Digital Products and descriptions of your Digital Products and Hard Products (collectively, “Your Content”), and all intellectual property rights in Your Content.

So when I click your link, I'm not engaging in a copyright violation. I'm making use of ArtStation's/Epic's license to distribute the original artist's works. When I save images from ArtStation that license does not transfer to me. Meaning if I were to repurpose that work it could be a copyright violation depending on the usage the artist agrees to. Established law states that I hold onto the rights of my work and any usage depends on what I explicitly state and agree to; emphasis on explicitly because the law will respect my terms and compensation first, and your intentions second. For example, if a magazine uses my images for several months without a license, I can document the usage time frame, send them an invoice, and begin negotiating because their legal team will realize that without a license they have no footing.

  • Yes, this applies to journalism as well. If you've agreed to let a news outlet use your works on a breaking story for credit/exposure, then you've provided a license for fair compensation in the form of credit/exposure.

I know this seems strange given how the internet freely transformed works for decades without repercussions. But as you know from sites like YouTube, copyright holders are not fans of people repurposing their works without mutually agreed-upon terms in the form of a license. If you remember the old show Mystery Science Theater 3000, they operated in the proper form: get license, transform work, commercialize. In the case of ArtStation, the site agrees to provide free hosting in compensation for the artist providing a license to distribute the work, without terms for monetization unless agreed upon through ArtStation's marketplace. At every step, the artist's rights to their work are respected and compensated when the law is applied.

If all this makes sense and we look back at AI art, well...

FaceDeer,
FaceDeer avatar

Meaning if I were to repurpose that work it could be a copyright violation depending on the usage the artist agrees to.

Training an AI doesn't "repurpose" that work, though. The AI learns concepts from it and then the work is discarded. No copyrighted part of the work remains in the AI's model. All that verbiage doesn't really apply to what's being done with the images when an AI trains on them, they are no longer being "used" for anything at all after training is done. Just like when a human artist looks at some reference images and then creates his own original work based on what he's learned from them.

TwilightVulpine,

Here is where a rhetorical sleight of hand is used by AI proponents.

It's displayed for people's appreciation. AI is not people, it is a tool. It's not entitled to the same rights as people, and the model it creates based on artists' works is itself a derivative work.

Even among AI proponents, few believe that the AI itself is an autonomous being who ought to have rights over their own artworks, least of all the AI creators.

FaceDeer,
FaceDeer avatar

I use tools such as web browsers to view art. AI is a tool too. There's no sleight of hand, AI doesn't have to be an "autonomous being." Training is just a mechanism for analyzing art. If I wrote a program that analyzed pictures to determine what the predominant colour in them was that'd be much the same, there'd be no problem with me running it on every image I came across on a public gallery.
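Something like that colour analyzer really is only a few lines. A stdlib-only sketch (the pixel list below is a stand-in; in practice you'd decode the image first, e.g. with Pillow's `Image.getdata`):

```python
from collections import Counter

def predominant_colour(pixels):
    """Return the most common (R, G, B) tuple in a sequence of pixels.

    In practice `pixels` would come from decoding an image file, e.g.
    list(Image.open(path).convert("RGB").getdata()) with Pillow.
    """
    colour, _count = Counter(pixels).most_common(1)[0]
    return colour

# Toy "image": mostly red with a couple of blue pixels.
pixels = [(255, 0, 0)] * 6 + [(0, 0, 255)] * 2
print(predominant_colour(pixels))  # (255, 0, 0)
```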

TwilightVulpine,

You wouldn't even be able to point a camera to works in public galleries without permission. Free for viewing doesn't mean free to do whatever you want with them, and many artists have made clear they never gave permission that their works would be used to train AIs.

Harrison,

Once you display an idea in public, it belongs to anyone who sees it.

Pulse,

By that logic I can sell anything I download from the web while also claiming credit for it, right?

Downloading to view != downloading to fuel my business.

FaceDeer,
FaceDeer avatar

No, and that's such a ridiculous leap of logic that I can't come up with anything else to say except no. Just no. What gave you that idea?

Pulse,

Because this thread was about the companies taking art, feeding it into their machine, and claiming not to have stolen it.

Then you compared that to clicking a link.

FaceDeer,
FaceDeer avatar

Yes, because it's comparable to clicking a link.

You said:

By that logic I can sell anything I download from the web while also claiming credit for it, right?

And that's the logic I can't follow. Who's downloading and selling Rutkowski's work? Who's claiming credit for it? None of that is being done in the first place, let alone being claimed to be "ok."

Pulse,

Because that is what they’re doing, just with extra steps.

The company pulled down his work, fed it to their AI, then sold the AI as their product.

Their AI wouldn’t work, at all, without the art they “clicked on”.

So there is a difference between me viewing an image in my browser and me turning their work into something for resale under my name. Adding extra steps doesn’t change that.

FaceDeer,
FaceDeer avatar

The company pulled down his work, fed it to their AI, then sold the AI as their product.

If you read the article, not even that is what's going on here. Stability AI:

  • Removed Rutkowski's art from their training set.
  • Doesn't sell their AI as a product.
  • Someone else added Rutkowski back in by training a LoRA on top of Stability's AI.
  • They aren't selling their LoRA as a product either.

So none of what you're objecting to is actually happening. All cool? Or will you just come up with some other thing to object to?

Pulse,

But they did.

(I’m on mobile so my formatting is meh)

They put his art in, only when called out did they remove it.

Once removed, they did nothing to prevent it being added back.

As for them selling the product, or not, at this point, they still used the output of his labor to build their product.

That’s the thing: everyone trying to justify why it’s okay for these companies to do it keeps leaning on semantics, legal definitions, or “well, back during the industrial revolution…” to try and get around the fact that what these companies are doing is unethical. They’re taking someone else’s labor, without compensation or consent.

amju_wolf,
@amju_wolf@pawb.social avatar

No, but you can download Rutkowski’s art, learn from it how to paint in his exact style and create art in that style.

Which is exactly what the image generation AIs do. They’re perhaps just a bit too good at it, certainly way better than an average human.

Which makes it complicated and morally questionable depending on how exactly you arrive at the model and what you do with it, but you can’t definitively say it’s copyright infringement.

adespoton,

What makes it even trickier is that taking AI generated art and using it however you want definitively isn’t copyright infringement because only works by humans can be protected by copyright.

Pulse,

But that’s not what they did; converting it into a set of instructions a computer can use to recreate it is just adding steps.

And, yes, that is what they’ve done, or else we wouldn’t find pieces of others’ works mixed in.

Also, even if that was how it worked, it’s still theft of someone else’s labor to feed your business.

If it wasn’t, they would have asked for permission first.

Pulse,

I think my initial reply to you was meant to go somewhere else but Connect keeps dropping me to the bottom of the thread instead of where the reply I’m trying to get to is.

I’m going to leave it (for consistency’s sake) but I don’t think it makes much sense as a reply to your post.

Sorry about that!

Pulse,

You keep comparing what one person, given MONTHS or YEARS of their life, could do with one artist’s work to what a machine doing NOT THE SAME THING can do with thousands of artists’ work.

The machine is not learning their style; it’s taking pieces of the work and dropping them in with other people’s work, then trying to blend it into a cohesive whole.

The analogy fails all over the place.

And I don’t care about copyright, I’m not an artist or an IP lawyer, or whatever. I can just look at a company stealing the labor of an entire industry and see it as bad.

FaceDeer,
FaceDeer avatar

The speed doesn't factor into it. Modern machines can stamp out metal parts vastly faster than blacksmiths with a hammer and anvil can, are those machines doing something wrong?

Pulse,

The machine didn’t take the blacksmiths work product and flood the market with copies.

The machine wasn’t fed 10,000 blacksmith-made hammers then told to, sorta, copy those.

Justify this all you want, throw all the bad analogies at it you want, it’s still bad.

Again, if this wasn’t bad, the companies would have asked for permission. They didn’t.

FaceDeer,
FaceDeer avatar

That's not the aspect you were arguing about in the comment I'm responding to. You said:

You keep comparing what one person, given MONTHS or YEARS of their life, could do with one artist’s work to what a machine doing NOT THE SAME THING can do with thousands of artists’ work.

And that's what I'm talking about here. The speed with which the machine does its work is immaterial.

Though frankly, if the machine stamping out parts had somehow "learned" how to do it by looking at thousands of existing parts, that would be fine too. So I don't see any problem here.

Pulse,

And that’s where we have a fundamental difference of opinion.

A company hiring an engineer to design a machine that makes hammers, then hiring one (or more) people to build the machine that then makes hammers, is the company benefiting from the work product of people they hired. While this may impact the blacksmith, they did not steal from the blacksmith.

A company taking someone else’s work product to then build their product, without compensation or consent, is theft of labor.

I don’t see those as equivalent situations.

FaceDeer,
FaceDeer avatar

At least now you're admitting that it's a difference of opinion, that's progress.

You think it should be illegal to do this stuff. Fine. I think copyright duration has been extended ridiculously long and should be a flat 30 years at most. But in both cases our opinions differ from what the law actually says. Right now there's nothing illegal about training an AI off of someone's lawfully-obtained published work, which is what was done here.

Pulse,

I’m not a fan of our copyright system. IMO, it’s far too long and should also include clauses that place anything not available for (easy) access in the public domain.

Also, I’m not talking about what laws say, should say or anything like that.

I’ve just been sharing my opinion that it’s unethical and I’ve not seen any good explanation for how stealing someone else’s labor is “good”.

TwilightVulpine,

Speed aside, machines don't have the same rights as humans do, so the idea that they are "learning like a person so it's fine" is like saying a photocopier machine's output ought to be treated as an independent work because it replicated some other work, and it's just so good and fast at it. AIs may not output identical work, but they still rely on taking an artist's work as input, something the creator ought to have a say over.

jarfil,

One human artist can, over a life time, learn from a few artists to inform their style.

These AI setups […] ALL the art from ALL the artists

So humans are slow and inefficient, what’s new?

First the machines replaced hand weavers, then ice sellers went bust, all the calculators got sacked, now it’s time for the artists.

There is no ethical stance for letting billion-dollar tech firms hoover up all the art ever created to try and remix it for profit.

We stand on the shoulders of generations of unethical stances.

Pulse,

“Other people were bad, so I should be bad too.”

Cool.

storksforlegs,
@storksforlegs@beehaw.org avatar

Yes, which is why we should try to do better.

kitonthenet, to technology in Greg Rutkowski Was Removed From Stable Diffusion, But AI Artists Brought Him Back - Decrypt

what I'm getting from all the AI stuff is the people in charge and the people that use it are scumbags

kboy101222,

Welcome to the wonderful world of the Silicon Valley tech era! Everything must be profitable at all costs! Everything must steal every tiny fact about you! Everything must include AI! Everything must go through enshittification!

MossyFeathers, (edited )

Pretty much. There are ways of using it that most artists would be okay with. Most of the people using it flat out refuse to use it like that though.

Edit: To expand on this:

Most artists would be okay with AI art being used as reference material, inspiration, assisting with fleshing out concepts (though you should use concept artists for that in a big production), rapid prototyping and whatnot. Most only care that the final product is at least mostly human-made.

Artists generally want you to actually put effort into what you’re making because, at the end of the day, typing a prompt into stable diffusion has more in common with receiving a free commission from an artist than it has with actually being an artist. If you’re going to claim something AI had a hand in as your own art, then you need to have done the majority of the work on it yourself.

The most frustrating thing to me, however, is that there are places in art that AI could participate in which would send artists over the moon, but it’s not flashy so no one seems to be working on making AI in those areas.

Most of what I’m personally familiar with has to do with 3d modeling, and in that discipline, people would go nuts if you released an AI tool that could do the UV work for you. Messing with UVs can be very tedious and annoying, to the point where most artists will just use a tool with conventional algorithms to auto-unwrap and pack UVs, then call it a day, even if the results aren’t great.

Another area is in rigging and weight painting. In order to animate a model, you have to rig it to a skeleton (unless you’re a masochist or trying to make a game accurate to late 90s-early 00s animation), paint the bone weights (which bones affect which polygons, and by how much), add constraints, etc. Most 3d modelers would leap at the prospect of having high-quality rigging and UVs done for them at the touch of a button. However, again, because it’s not flashy to the general public, no one’s put any effort into making an AI that can do that (afaik at least).

Finally, even if you do use an AI in ways that most artists would accept as valid, you’ll still have to prove it because there are so many people who put a prompt into stable diffusion, do some minor edits to fix hands (in older version), and then try to pass it off as their own work.

DekkerNSFW,

Sadly, AI isn't as good with sparse data like vertices and bones, so most attempts to use AI on 3D stuff are via NeRFs, which are closer to a "photo" you can walk around in than to an actual 3D scene.

AzureDusk10, to technology in Greg Rutkowski Was Removed From Stable Diffusion, But AI Artists Brought Him Back - Decrypt

The real issue here is the transfer of power away from the artist. This artist has presumably spent years and years perfecting his craft. Those efforts are now being used to line someone else’s pockets, in return for no compensation and a diminishment in the financial value of his work, and, by the sounds of it, little say in the matter either. That to me seems very unethical.

millie,

Personally, as an artist who spends the vast majority of their time on private projects that aren’t paid, I feel like it’s put power in my hands. It’s best at sprucing up existing work and saving huge amounts of time detailing. Because of stable diffusion I’ll be able to add those nice little touches and flashy bits to my work that a large corporation with no real vision has at their disposal.

To me it makes it much easier for smaller artists to compete, leveling the playing field a bit between those with massive resources and those with modest resources. That can only be a good thing in the long run.

But I also feel like copyright more often than not rewards the greedy and stifles the creative.

moon_matter,
moon_matter avatar

But that's sort of the nature of the beast when you put your content up for free on a public website. Does Kbin or Beehaw owe us money for our comments on this thread? What about everyone currently reading? At least KBin and Beehaw are making profit off of this.

The argument is not as clear cut as people are making it sound and it has potential to up-end some fundamental expectations around free websites and user-generated content. It's going to affect far more than just AI.

jarfil,

At least KBin and Beehaw are making profit off of this.

How?

lolola, to nostupidquestions in Should there be an "ALL OFF" button to instantly shut down all these new AI Defense bots that the Military in the US want to build and deploy in the thousands?
@lolola@lemmy.blahaj.zone avatar

Maybe can we take a step back and ask whether we need thousands of AI defense bots at all? Or are we past that point?

Wookie,

Realistically, who’s gonna stop them?

Something_Complex,

I believe in you, or should I say, gggrrrrrrruuuuuuuuu (pardon my Wookiee, I only had one semester in college)

Zippy, (edited )

I don’t like it, but I think we do. China and Russia will certainly have them, and they will get ten times better in the same number of years.

I watched the TED talk on defense drones. Scary shit. Thing is, I work with commercial cameras and have, in hand, cameras that can not only identify all kinds of objects, such as humans; they can recognize individual humans and put a name to them. They can recognize if people are loitering or if someone is being followed. They can distinguish a car from a truck from a bus from a bike. This is not done on a server but through the power of the CPU in the camera alone. The cost? 500 dollars.

Point being, the power available in such a low-cost item is staggering. Combined with a weapons platform, it is scary. A terrorist group could distribute hundreds into bushes and they could just sit there for a week in low-power mode, waiting to recognize a single person and spring into action. This is stuff we have right now, off the shelf.

What will be part of military arsenals in ten years will eclipse this current tech significantly. Troops won’t be ambushed by live human fire but by thousands of drones that care not for their survival.

Anticorp,

Can I get a few of those cameras and have them record any people around my house that aren’t me? What are they called?

Zippy,

Hikvision line mainly if you want the low cost ones.

They won’t do a ‘not me’ identification, mainly because it can only identify you, or any person, if it gets a decent view of you. Basically the first event will be ‘I see a human’, and if you look at the camera then it can also fire an event saying, basically, ‘Jack Black is here’. Those are two different kinds of events you need to turn on. But the person-recognition event can only fire if it recognizes you.

I thought the same thing as you, in that I could have it ignore known people. But it’s like you looking out a window: you see someone from a distance, and to recognize them you need them to come closer. Thus, as a person, you don’t call the cops or create an event immediately, but at some point you might. The cameras are not quite that smart yet but, as said, ten years?

kava,

What I think is dangerous is terrorists or mass killers getting dozens or hundreds of small drones and installing explosives on them. Install these cameras and CPUs you mentioned that can recognize human faces and have them fly into someone’s face and then explode.

You could kill many people and unless we start installing AA turrets all over our populated cities, there seems to be little we can do to stop it.

Zippy,

I had mentioned that in an earlier post. It is pretty scary. They could sit in a bush for a month using extremely low power motion detection. See any motion turn on camera to look for human recognition.

sdoorex,

It’s the nuclear arms race part deux: AI boogaloo.

JayDee,

Autonomous drones made by China have been used in Papua New Guinea to bomb at least one village so I think the US is actually behind the curve in terms of the AI arms race.

This is one of those classical sci-fi apocalypse ideas, where humans make autonomous war machines they can’t turn off, and the machines outlive the humans and continue the war for them.

astraeus,
@astraeus@programming.dev avatar

How is it that when it comes to reckless ideas and notions Congress takes millions of years and the Pentagon takes no more than three business days to implement?

RygelTheDom, to technology in Greg Rutkowski Was Removed From Stable Diffusion, But AI Artists Brought Him Back - Decrypt

What blurry line? An artist doesn’t want his art stolen from him. Seems pretty cut and dry to me.

falsem,

If I look at someone's paintings, then paint something in a similar style did I steal their work? Or did I take inspiration from it?

Pulse,

No, you used it to inform your style.

You didn’t drop his art on to a screenprinter, smash someone else’s art on top, then try to sell t-shirts.

Trying to compare any of this to how one individual human learns is such a wildly inaccurate way to justify stealing someone else’s work product.

falsem,

If it works correctly it's not a screenprinter; the output is something unique.

Pulse,

The fact that folks can identify the source of various parts of the output, and that intact watermarks have shown up, shows that it doesn’t work like you think it does.

jarfil,

Does that mean the AI is not smart enough to remove watermarks, or that it’s so smart it can reproduce them?

falsem,

It means that it's stupid enough that it reproduces them - poorly.

Swedneck,
@Swedneck@discuss.tchncs.de avatar

It’s like staring yourself blind at artworks with watermarks until you start seeing artworks with blurry watermarks in your dreams

TheBurlapBandit,

It’s not smart or stupid. It does what it’s been trained on, nothing more.

nickwitha_k,

LLMs and directly related technologies are not AI and possess no intelligence or capability to comprehend, despite the hype. So, they are absolutely the former, though it’s rather like a bandwagon sort of thing (x number of reference images had a watermark, so that’s what the generated image should have).

jarfil,

LLMs […] no intelligence or capability to comprehend

That’s debatable. LLMs have shown emergent behaviors aside from what was trained, and they seem to be capable of comprehending relationships between all sorts of tokens, including multi-modal ones.

Anyway, Stable Diffusion is not an LLM; it’s more of a “neural network hallucination machine” whose hallucinations are sometimes really close to some of the input data, or parts of it. It still needs to be “smart” enough to decompose the original data into enough of the right patterns that it can reconstruct part of the original from the patterns alone.

nickwitha_k,

Thanks for the clarification!

LLMs have indeed shown interesting behaviors but, from my experience with the technology and how it works, I would say that any claims of intelligence being possessed by a system that is only an LLM would be suspect and require extraordinary evidence to prove that it is not mistaken anthropomorphizing.

jarfil,

I don’t think an LLM alone can be intelligent… but I do think it can be the central building block for a sentient self-aware intelligent system.

Humans can be thought of as being made of a set of field-specific neural networks, tied together by a looping self-evaluating multi-modal LLM that we call “conscience”. The ability of an LLM to consume its own output, is what allows it to be used as the conscience loop, and current LLMs being trained on human language with all its human nuance, is an extra bonus.
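That self-consuming loop can be sketched in a few lines. The "model" below is a stand-in callable rather than a real LLM, so this only illustrates the feedback structure being described, not any claim about consciousness:

```python
def conscience_loop(model, seed_thought, steps):
    """Feed a model's output back in as its next input, keeping a trace.

    `model` is any callable from text to text; a real system would wrap an
    LLM call here, possibly mixing in fresh sensory input each iteration.
    """
    thought = seed_thought
    trace = [thought]
    for _ in range(steps):
        thought = model(thought)  # output becomes the next input
        trace.append(thought)
    return trace

# Stand-in "model" that just elaborates on its input.
toy_model = lambda t: t + " ...which makes me wonder"
print(conscience_loop(toy_model, "I exist", 2)[-1])
```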

Probably some other non-text multi-modal neural networks capable of consuming their own output could also be developed and be put in a loop, but right now we have LLMs, and we kind of understand most of what they’re saying, and they kind of understand most of what we’re saying, so that makes communication easier.

I mean, it is anthropomorphizing, but in this case I think it makes sense because it’s also anthropogenic, since these human language LLMs get trained on human language.

nickwitha_k,

Absolutely agreed with most of that. I think that LLMs and similar technologies are incredible and have great potential to be components of artificial intelligences. LLMs by themselves are more akin to “virtual intelligences” portrayed in the Mass Effect games, but currently generally with fewer guard rails to prevent hallucinations.

I suspect there may be a few other concurrent “loops”, likely not as well compared to LLMs (though some might be) running in our meat computers and their inefficiency and poor fidelity likely ends up being part of the factors that make our consciousness. Otherwise, your approximation makes a lot of sense. Still a lot to learn about our meat computers but, I really do hope we, as a species, succeed in making the world a bit less lonely (by helping other intelligence emerge).

jarfil,

There is some discussion about people “with an internal monologue”, and people “without”. I wonder if those might be some different ways of running that loop, or maybe some people have one loop take over others… and the whole “dissociative personality disorder” could be multiple loops competing for being the main one at different times.

Related to fidelity, some time ago I read an interesting thing: consciousness means having brainwaves out of sync; when they get in sync, people go unconscious. From a background in electronics I’ve always assumed the opposite (system clock and such), but apparently our consciousness emerges from the asynchronous differences, meaning the inefficiencies and poor fidelity might be a feature, not a bug.

Anyway, right now, as someone suffering from insomnia, I’d happily merge with some AI just to get a “pause” button.

FaceDeer,
FaceDeer avatar

They can't, and "intact" watermarks don't show up. You're the one who is misunderstanding how this works.

When a pattern is present very frequently the AI can learn to imitate it, resulting in things that closely resemble known watermarks. This is called "overfitting" and is avoided as much as possible. But even in those cases, if you examine the watermark-like pattern closely you'll see that it's usually quite badly distorted and only vaguely watermark-like.
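For anyone unfamiliar with the term: overfitting is when a model with enough capacity relative to its data memorizes the training examples instead of generalizing. A deliberately crude, stdlib-only analogy (polynomial interpolation; diffusion models are vastly more complex, and their trainers actively work against this):

```python
def lagrange_fit(points):
    """Return a function that interpolates exactly through `points`.

    With as many free parameters as data points, the "model" reproduces its
    training data perfectly -- memorization, the overfitting failure mode.
    """
    def model(x):
        total = 0.0
        for i, (xi, yi) in enumerate(points):
            term = yi
            for j, (xj, _) in enumerate(points):
                if i != j:
                    term *= (x - xj) / (xi - xj)
            total += term
        return total
    return model

train = [(0, 1.0), (1, 3.0), (2, 2.0), (3, 5.0)]
model = lagrange_fit(train)
# The overfit model returns every training point exactly...
print([round(model(x), 6) for x, _ in train])  # [1.0, 3.0, 2.0, 5.0]
# ...while wildly swinging between them, a sign it hasn't "learned" a trend.
```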

Pulse,

Yes, because “imitate” and “copy” are different things when stealing from someone.

I do understand how it works; the “overfitting” was just laying clear what it does. It copies, but tries to sample things in a way that won’t look like clear copies. It has no creativity; it is trying to find new ways of making copies.

If any of this was ethical, the companies doing it would have just asked for permission. That they didn’t says a everything you need to know.

I don’t usually have these kinds of discussions anymore; I got tired of conversations like this back in 2016, when it became clear that people will go to the ends of the earth to justify unethical behavior as long as the people being hurt by it are people they don’t care about.

FaceDeer,
FaceDeer avatar

And we're back to you calling it "stealing", which it certainly is not. Even if it was copyright violation, copyright violation is not stealing.

You should try to get the basic terminology right, at the very least.

Pulse,

Just because you’ve redefined theft in a way that makes you feel okay about it doesn’t change what they did.

They took someone else’s work product, fed it into their machine then used that to make money.

They stole someone’s labor.

FaceDeer,
FaceDeer avatar

I haven't "redefined" it, I'm using the legal definition. People do sometimes sloppily equate copyright violation with theft in common parlance, but they're in for a rude awakening if they intend to try translating that into legal action.

Using that term in an argument like this is merely trying to beg the question of whether it's wrong, since most everyone agrees that stealing is wrong you're trying to cast the action of training an AI as something everyone will by default agree is wrong. But it's not stealing, no matter how much you want it to be, and I'm calling that rhetorical trick out here.

If you want to argue that it's wrong you need to argue against the actual process that's happening, not some magical scenario where the AI trainers are somehow literally robbing people.

Pulse,

Taking someone’s work product and converting it, without compensation and consent, into your profit is theft of labor.

Adding extra steps, like, say, training an AI, doesn’t absolve the theft of labor.

Were it ethical, the companies doing it would have asked for permission and been given consent. They didn’t.

FaceDeer,
FaceDeer avatar

Taking someone’s work product and converting it, without compensation and consent, into your profit is theft of labor.

That's not what's going on here. The finished product contains only the style of the artist that the AI was trained on, and style is not copyrightable. Which is a damn good thing, as humans have been learning from each other's "work products" and mimicking each other's styles since time immemorial.

BTW, theft of labor means failing to pay wages or provide employee benefits owed to an employee by contract or law. You're using that term incorrectly too, Greg Rutkowski wasn't hired to do anything for the people who trained the AI off of his work.

Pulse,

No, I’m not using it incorrectly, I’m just not concerned with the legal definition as I’m not a lawyer or anyone tied up in this mess.

If you do a thing that takes time and skill, and then someone copies it, they stole your labor.

Saying they “copied his style”, the style he spent a lifetime crafting, then trying to say they didn’t benefit, at no cost, from the labor he put into crafting that style because “well actually, the law says…” is a bad argument, as it tries to minimize what they did.

If their product could not exist without his labor, and they did not pay him for that labor, they stole his labor.

For, like, the fourth time in this thread: were this ethical, they would have asked for permission, they didn’t.

FaceDeer,
FaceDeer avatar

If you're just going to make up the meanings of words there's not much point in using them any further.

Pulse,

But I’m not.

You’re trying to say that, because this one law doesn’t say it’s bad it must therefore be good (or at least okay).

I’m simply saying that if you profit from someone else’s labor, without compensating them (or at least getting their consent), you’ve stolen the output of that labor.

I’m happy to be done with this, I didn’t expect my first Lemmy comment to get any attention, but no, I’m not going to suddenly be okay with this just because the legal definition of “stealing labor” is too narrow to fit this scenario.

whelmer,

The law doesn’t even say it’s okay. What FaceDeer is referring to is that copyright infringement is a different category of crime than theft, which is defined as pertaining to physical property. It’s a meaningless point because, as you said, this isn’t a courtroom and we aren’t lawyers and the concept of intellectual property theft is well understood.

It’s a thing engineers and lawyers often seem to do, to take the way terms are used in a particular professional jargon and assume that that usage is “the real” usage.

fades,

I don’t disagree but stolen is a bit of a stretch

teichflamme,

Nothing was stolen.

Drawing inspiration from someone else by looking at their work has been around for centuries.

Imagine if the Renaissance couldn’t happen because artists didn’t want their style stolen.

FaceDeer,
FaceDeer avatar

His art was not "stolen."

KoboldCoterie,
@KoboldCoterie@pawb.social avatar

I don’t fully understand how this works, but if they’ve created a way to replicate his style that doesn’t involve using his art in the model, how is it problematic? I understand not wanting models to be trained using his art, but he doesn’t have exclusive rights to the art style, and if someone else can replicate it, what’s the problem?

This is an honest question, I don’t know enough about this topic to make a case for either side.

Hubi,

You’re pretty spot on. It’s not much different from a human artist trying to copy his style by hand but without reproducing the actual drawings.

delollipop,

Do you know how they recreated his style? I couldn’t find such information or frankly have enough understanding to know how.

But if they either use his works directly or works created by another GAI with his name/style in the prompt, my personal feeling is that would still be unethical, especially if they charge money to generate his style of art without compensating him.

Plus, I find that the opt-out mentality really creepy and disrespectful

“If he contacts me asking for removal, I’ll remove this.” Lykon said. “At the moment I believe that having an accurate immortal depiction of his style is in everyone’s best interest.”

fsniper,

I still have trouble understanding the distinction between "a human consuming different artists, and replicating the style" vs "software consuming different artists, and replicating the style".

Otome-chan,
Otome-chan avatar

there's no distinction. people are just robophobic.

KoboldCoterie,
@KoboldCoterie@pawb.social avatar

Do you know how they recreated his style? I couldn’t find such information or frankly have enough understanding to know how.

I don’t, but another poster noted that it involves using his art to create the LoRA.

Plus, I find that the opt-out mentality really creepy and disrespectful

I don’t know about creepy and disrespectful, but it does feel like they’re saying “I know the artist doesn’t want me to do this, but if he doesn’t specifically ask me personally to stop, I’m going to do it anyway.”

averyminya,

But if they either use his works directly or works created by another GAI with his name/style in the prompt, my personal feeling is that would still be unethical, especially if they charge money to generate his style of art without compensating him.

LoRAs are trained on image datasets, but these images are just ones available anywhere. It’s really not much different from you taking every still of The Simpsons and using it. What I don’t understand is how these are seen as problematic, because a majority of end users utilizing AI are doing it under fair use.

No one charges for LoRAs or models AFAIK. If they do, it hasn’t come across the Stable Diffusion discords I moderate.

People actually selling AI generated art is also a different story, and that’s where it falls outside of fair use if the models being used contain copyrighted work. It seems pretty cut and dry: artists complained about being emulated by other artists before AI, so it’s only reasonable that it happens again. If people are profiting off it, it should at least give compensation to the original artist (if it could be adjusted so that per-token payments are given as royalties to the artist). However, on the other hand, think about The Simpsons, or Pokemon, or anything that has ever been sold as a sticker/poster/display item.

I’m gonna guess that a majority of people have no problem with that IP theft cause it’s a big company. Okay… so what if I love Greg but he doesn’t respond to my letters and e-mails begging to commission him for a Pokemon Rutkowski piece? Under fair use there’s no reason I can’t create that on my own, and if that means creating a dataset of all of his paintings that I paid for, then it’s technically legal.

The only thing here that would be unethical or illegal is if his works are copyrighted and being redistributed. They aren’t being redistributed, and currently copyrighted materials aren’t protected from being used in AI models, since the work done by AI can’t be copyrighted. In other words, while it may be disrespectful to go against the artist’s wishes to not be used in AI, there’s no current grounds for it other than an artist not wanting to be copied… which is a tale as old as time.

TL;DR model and LoRA makers aren’t charging, users can’t sell or copyright AI works, and copyrighted works aren’t protected from being used in AI models (currently). An artist not wanting to be used currently has no grounds other than making strikes against anything that is redistributing copies of their work. If someone is using this LoRA to recreate Greg Rutkowski paintings and then proceeds to give or sell them, then the artist is able to claim that there’s theft and damages… but the likelihood of an AI model being able to do this is low. The likelihood of someone selling these is higher, but from my understanding artistic styles are pretty much fair game any way you swing it.

I understand wanting to protect artists. Artists also get overly defensive at times - I’m not saying that this guy is; I’m actually more on his side than my comment makes it seem, especially after how he was treated in the discord I moderate. I’m more just pointing out that there’s a slippery slope both ways, and the current state of U.S. law on it.

SweetAIBelle,
SweetAIBelle avatar

Generally speaking, the way training works is this:
You put together a folder of pictures, all the same size. It would've been 1024x1024 in this case. Other models have used 768x768 or 512x512. For every picture, you also have a text file with a description.

The training software takes a picture, slices it into squares, generates a square the same size of random noise, then trains on how to change that noise into that square. It associates that training with tokens from the description that went with that picture. And it keeps doing this.

Then later, when someone types a prompt into the software, it tokenizes it, generates more random noise, and uses the denoising methods associated with the tokens you typed in. The pictures in the folder aren't actually kept by it anywhere.

From the side of the person doing the training, it's just put together the pictures and descriptions, set some settings, and let the training software do its work, though.
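A toy sketch of the objective described above, making no assumptions about Stable Diffusion's actual code (plain NumPy, with a made-up `timestep_strength` knob standing in for a real noise schedule): the trainer only ever works with (noisy input, noise target) pairs, which is why the original pictures aren't kept anywhere in the model.

```python
import numpy as np

rng = np.random.default_rng(0)

def training_example(image, timestep_strength):
    """Build one toy training pair: a noised image and the noise to predict.

    image: float array in [0, 1]; timestep_strength: how much noise (0..1).
    Real trainers schedule this over many timesteps; this shows a single one.
    """
    noise = rng.standard_normal(image.shape)
    noisy = np.sqrt(1 - timestep_strength) * image + np.sqrt(timestep_strength) * noise

    # The network's job is to predict `noise` given `noisy` (conditioned on
    # the caption tokens); training minimizes the mean squared error:
    def loss(predicted_noise):
        return np.mean((predicted_noise - noise) ** 2)

    return noisy, noise, loss

image = rng.random((8, 8))  # tiny stand-in for a 1024x1024 training crop
noisy, target, loss = training_example(image, 0.5)
print(loss(target))                      # a perfect prediction scores 0.0
print(loss(np.zeros_like(target)) > 0)   # predicting nothing is penalized
```

Only the denoising weights learned from many such pairs survive into the released model; the folder of pictures does not.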

(No money involved in this one. One person trained it and plopped it on a website where people can download loras for free...)

Rhaedas,
Rhaedas avatar

they charge money to generate his style of art without compensating him.

That's really the big thing, not just here but for any material that's been used to train on without permission or compensation. The difference is that most of it is so subtle it can't be picked out, but an artist's style is obviously a huge parameter, since his name was being used to call out those particular training aspects during generations. It's a bit hypocritical to say you aren't stealing someone's work when you stick his actual name in the prompt. It doesn't really matter how many levels the art style has been laundered through, it still originated from him.

conciselyverbose,

It is unconditionally impossible to own an artistic style. "Stealing a style" cannot be done.

snooggums,
snooggums avatar

Is drawing Mickey Mouse in a new pose copying the style or copying Mickey Mouse?

conciselyverbose,

The second.

I'm not sure how that's relevant here, though. There is nothing at all being copied but an aesthetic.

ricecake,

You said it yourself. You’re drawing Micky mouse in a new pose, so you’re copying Mickey mouse.

Drawing a cartoon in the style of Mickey mouse isn’t the same thing.

You can’t have a copyright on “big oversized smile, exaggerated posture, large facial features, oversized feet and hands, rounded contours and a smooth style of motion”.

Rhaedas,
Rhaedas avatar

And yet the artist's name is used to push the weights towards pictures in their style. I don't know what the correct semantics are for it, nor the legalities. That's part of the problem, the tech is ahead of our laws, as is usually the case.

conciselyverbose,

And yet the artist's name is used to push the weights towards pictures in their style.

That's not even vaguely new in the world of art.

Imitating style is the core of what art is. It's absolutely unconditionally protected by copyright law. It's not even a .01 out of 10 on the scale of unethical. It's what's supposed to happen.

The law might not cover this yet, but any law that restricts the fundamental right to build off of the ideas of others that are the core of the entirety of human civilization is unadulterated evil. There is no part of that that could possibly be acceptable to own.

Rhaedas,
Rhaedas avatar

I totally agree with you on protecting the basics of creativity and growth. I think the core issue is using "imitate" here. Is that what the model is doing, or is that an anthropomorphism of some sense that there's intelligence guiding the process? I know it seems like I'm nitpicking things to further my point, but the fact that this is an issue to many even outside artwork says there is a question here of what is and isn't okay.

conciselyverbose,

The AI is not intelligent. That doesn't matter.
Nothing anyone owns is being copied or redistributed. The creator isn't the tool; it's the person using the tool.

AI needs two things to work: an algorithm and data. If training is allowed to anyone, anyone can create their own algorithms and use the AI as a tool to create innovative new works with some ideas borrowed from other work.

If data is proprietary, they cannot. But Disney still can. They'll just as successfully flood out all the artists who can't use AI because they don't have a data set, but now they and the two other companies in the world who own IP are basically a monopoly (or tri- or whatever) and everyone else is screwed.

altima_neo,
@altima_neo@lemmy.zip avatar

It’s only using his name because the person who created the LORA trained it with his name. They could have chosen any other word.
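For illustration only (the filenames and the `grutk` trigger token are invented here, not taken from the actual LoRA): in typical LoRA caption workflows the trigger is just text inside the per-image caption files, so swapping it for any other unique token leaves the training itself unchanged.

```python
# Hypothetical per-image caption files of the kind used for LoRA training:
# the "trigger" is simply a token embedded in each caption.
captions = {
    "0001.txt": "grutk style, dragon circling a burning castle",
    "0002.txt": "grutk style, knight in ornate gilded armor",
}

# The trigger word is arbitrary; renaming it everywhere yields an equivalent
# dataset that just answers to a different prompt token.
renamed = {name: text.replace("grutk", "zxstyle") for name, text in captions.items()}

print(renamed["0001.txt"])  # -> zxstyle style, dragon circling a burning castle
```

Which is the point being made: the artist's name appears in prompts only because the person training chose it as the trigger, not because the model requires it.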

Rhaedas,
Rhaedas avatar

True, and then because it's a black box there wouldn't be a known issue at all. Or maybe it would be much less of an issue because the words might have blended others into the mix, and his style wouldn't be as obvious in the outputs, and/or it would be easier to dismiss. Did the training involve actual input of his name, or was that pulled from the source trained on? How much control was in the training?

Peanutbjelly, (edited )

Just wait until you can copyright a style. Guess who will end up owning all the styles.

Spoiler, it’s wealthy companies like Disney and Warner. Oh, you used cross hatching? Disney owns the style now, you thief.

Copyright is fucked. Has been since before the Mickey mouse protection act. Our economic system is fucked. People would rather fight each other and new tools instead of rallying against the actual problem, and it’s getting to me.

Pseu,

You’re right, copyright won’t fix it; copyright will just enable large companies to leverage more of their work and extract more from the creative space.

But who will benefit the most from AI? The artists seem to be getting screwed right now, and I’m pretty sure that Hasbro and Disney will love to cut costs and lay off artists as soon as this blows over.

Technology is capital, and in a capitalist system, that goes to benefit the holders of that capital. No matter how you cut it, laborers including artists are the ones who will get screwed.

TheBurlapBandit,

Me, I’ll benefit the most. I’ve been using a locally running instance of the free and open source AI software Stable Diffusion to generate artwork for my D&D campaigns and they’ve never looked more beautiful!

FaceDeer,
FaceDeer avatar

Same here. It's awesome being able to effectively "commission" art for any random little thing the party might encounter. And sometimes while generating images there'll be surprising details that give me new ideas, too. It's like brainstorming with ChatGPT but in visual form.

zoostation, to technology in OpenAI wants to raise 5-7 trillion dollars. Yes, Trillion

Could that level of investment ever be recouped in any other manner than by replacing vast numbers of workers and their salaries?

RegalPotoo,
@RegalPotoo@lemmy.world avatar

That’s my question; presumably the people in charge of that much wealth aren’t total fools and will be wanting to see some actual numbers and a business case as to how they will see a return, not just platitudes and enthusiasm.

iopq,

That’s how productivity growth is achieved: a smaller number of workers does the same task.

Of course, the created wealth is eventually invested back, and new products/services require new jobs.

For example, right now we have some of the highest labor participation in years, despite rising productivity

bunnyfc,
bunnyfc avatar

yeah, and productivity increases decoupled from wages in 1980; while productivity rises, wages stay the same - why should anyone who's not a multimillionaire find that acceptable?

QuaternionsRock,

Why would anyone who knows this fact attack productivity increases, rather than their decoupling from wages?

MxM111,
MxM111 avatar

Yes, think about how computers had multiplicative effect on productivity. The same may be possible with AI.

ikidd,
@ikidd@lemmy.world avatar

Well, see, if we grind down 8 billion people into a nourishing slurry with a shelf life of a century, that should be worth at least $1000 a person, with inflation. That’s a 50% profit on your investment!

remotelove, to technology in OpenAI wants to raise 5-7 trillion dollars. Yes, Trillion

That’s fairly bold to ask for ~6% of the total world economy as well as a sizable chunk of the world’s energy.

herrcaptain,

See now, they need to ask for more like 25% of the total world economy. That way what they actually want is gonna seem like a great deal.

essteeyou,

I was about to ask for the other 94%…

Tremble,

.000000001 percent of that…. Please

noodlejetski,

so… fifty bucks?

SmackemWittadic,
@SmackemWittadic@lemmy.world avatar

$100,000,000,000,000 × 0.00000000001 is $1,000!

I’d happily take that

noodlejetski,

ah, I was starting with the OpenAI’s goal, not the total world economy for some reason.

scarabic,

LOL yes I did the same math. Dude had one wish and he blew it!

“No please, Mr. Genie I meant .000000001 not .00000000001%!”

Genie: so $5,000? (Snaps fingers)

SatansMaggotyCumFart,

AI will double the world’s economy so they’re basically giving us an extra 94% for free.

topinambour_rex,
@topinambour_rex@lemmy.world avatar

That’s called inflation.

avidamoeba,
@avidamoeba@lemmy.ca avatar

The world economy cannot double without destroying the planet’s ability to sustain us.

Num10ck,

imagine it like putting the middle class’s wealth in a blender.

nilloc,

But instead of a delicious smoothie, it makes that toxic dust like the will it blend videos.

PoliticallyIncorrect, (edited ) to technology in OpenAI wants to raise 5-7 trillion dollars. Yes, Trillion
@PoliticallyIncorrect@lemmy.world avatar

Better to get that money before the Ponzi falls down…

recapitated, to technology in OpenAI wants to raise 5-7 trillion dollars. Yes, Trillion

Weird that an Altman wants all this human to go to something alternative to humans.

carpelbridgesyndrome, to technology in OpenAI wants to raise 5-7 trillion dollars. Yes, Trillion

Scam Altman Freid strikes again

mindbleach, to technology in OpenAI wants to raise 5-7 trillion dollars. Yes, Trillion

And I want a pony.

_sideffect, to technology in OpenAI wants to raise 5-7 trillion dollars. Yes, Trillion

Ah yes, open “ai”… where we parse as many websites as possible, rank them internally by keyword usage and/or views, then have our “bot” spew that shit at you

mindbleach,

Stonks guy: Art fish Lint gents.

Blackmist, to technology in OpenAI wants to raise 5-7 trillion dollars. Yes, Trillion

At this point I wouldn’t mind them accidentally creating Skynet and killing us all in a nuclear inferno, just so we don’t have to listen to any more insufferable, grifting techbros.

Max Zorin had the right idea about silicon valley.

jay9,

“Accidentally”

A_Very_Big_Fan, to technology in OpenAI wants to raise 5-7 trillion dollars. Yes, Trillion

I, too, would like 5-7 trillion dollars

Buttons,
@Buttons@programming.dev avatar

Best I can do is 4 trillion, take it or leave it

BigBananaDealer,

you could donate billions per year and still have a trillion left by the time you died. fuck man what would you even do with that much money?

A_Very_Big_Fan,

Drugs!

I’d probably give Louis Rossman and the institute for justice a trillion each. Rossman would use it to lobby for the Right to Repair movement, and the IFJ would use it to get rid of blatantly unjust legislation like civil asset forfeiture.

Donating a billion to GDQ while they’re live would be pretty funny

long_chicken_boat,

rossman is a greedy influencer wannabe that wants to profit from internet injustices, stop being a fanboy.

A_Very_Big_Fan,

He works for a non-profit organization as a Right to Repair lobbyist, so idk what you’re on about. You’d already know this if you did any research beyond what you probably heard on Twitter.

If you think he’s “greedy” for using his AdSense and MacBook repair money to create tutorials/Wikis for circumventing anti-repair measures and spreading awareness about unethical tech practices, idk what to tell you.

mindbleach,

One dollar for every point in a Do DonPachi run.

Hell, have them bring back the week-long after-show, just for stupid challenges you made up. $5000 for every game PJ can go out-of-bounds in, in twenty minutes. Four-person race to blindfold Ornstein & Smough. Mitch beating every NES SMB game while holding his breath underwater.

At some point just fly everyone out to the old Nickelodeon Guts course - which you’ve purchased and restored with approximately 0% of your fortune - to have these exuberant dorks compete on entirely different nostalgia-bait. You could probably split them evenly into Team Cishet and Team Queer. Well, okay. Team Queer might need to spot them a few bisexuals, to keep team size balanced.

DeepGradientAscent,
@DeepGradientAscent@programming.dev avatar

nutsack, (edited ) to technology in OpenAI wants to raise 5-7 trillion dollars. Yes, Trillion

hell yeah feed the beast let’s all turn into weird little egg holder things
