    Drusas,

    Humans can't entirely replicate one another's voices. I recognize voices far better than faces, and I know I'm not the only one out there who does. There are a lot of good imitators out there, but they can't truly replicate another person's voice.

    Rossel,

    The legal grounds are that the AI is trained using voice lines that can indeed be copyrighted material. Not the voice itself, but the delivered lines.

    magic_lobster_party,

    Making derivatives of existing game assets is a core part of modding. I don’t see how this is any different from splicing existing voice lines to make them say whatever you want them to say.

    Maybe it’s morally wrong to use the work of voice actors for NSFW purposes without their consent, but I’m not sure if it’s illegal from a copyright standpoint.

    FaceDeer,
    FaceDeer avatar

    The problem with that approach is that the resulting AI doesn't contain any identifiable "copies" of the material that was used to train it. No copying, no copyright. The AI model is not a legally recognizable derivative work.

    If the future output of the model that happens to sound very similar to the original voice actor counts as a copyright violation, then human sound-alikes and impersonators would also be in violation and things become a huge mess.

    bedrooms, (edited )

    I'm finally reading a comment from someone who actually knows how machine learning works. Too many people craft their argument before learning about the technology. Or maybe they think reading a few blog articles counts as research.

    IncognitoErgoSum,

    Unfortunately, the courts and legislatures may craft their opinions and laws, respectively, without knowing how machine learning actually works.

    ChemicalRascal,
    ChemicalRascal avatar

    The problem with that approach is that the resulting AI doesn't contain any identifiable "copies" of the material that was used to train it. No copying, no copyright. The AI model is not a legally recognizable derivative work.

    That's a HUGE assumption you've made, and certainly not something that has been tested in court, let alone found to be true.

    In the context of existing legal precedent, there's an argument to be made that the resulting model is itself a derivative work of the copyright-protected works, even if it does not literally contain an identifiable copy, as it is a derivative of the work in the common meaning of the term.

    If the future output of the model that happens to sound very similar to the original voice actor counts as a copyright violation, then human sound-alikes and impersonators would also be in violation and things become a huge mess.

    A key distinction here is that a human brain is not a work, and in that sense, a human brain learning things is not a derivative work.

    LoafyLemon,
    LoafyLemon avatar

    In that case all work would be derivative.

    ChemicalRascal,
    ChemicalRascal avatar

    No? No. Not all work is analogous to training a generative model. That's a really bizarre thing to say, and I'm shocked to hear it from you.

    FaceDeer,
    FaceDeer avatar

    That's a HUGE assumption you've made

    No, I know how these neural nets are trained and how they're structured. They really don't contain any identifiable copies of the material used to train them.

    and certainly not something that has been tested in court

    Sure, this is brand new tech. It takes time for the court cases to churn their way through the system. If that's going to be the ultimate arbiter, though, then what's to discuss in the meantime?

    IncognitoErgoSum,

    Also, neural network weights are just a bunch of numbers, and I'm pretty sure data can't be copyrighted. And yes, images and sounds and video stored on a computer are numbers too, but those can be played back or viewed by a human in a meaningful way, and as such represent a work.

    ChemicalRascal,
    ChemicalRascal avatar

    Also, neural network weights are just a bunch of numbers, and I'm pretty sure data can't be copyrighted.

    Just being "a bunch of numbers" doesn't stop it from being a work, it doesn't stop it from being a derivative work, and you absolutely can copyright data -- all digitally encoded works are "just data".

    A trained AI is not a measurement of the natural world. It is a thing that has been created from the processing of other things -- in the common sense of the word, it is derivative of those works. What remains, IMO, is the question of whether it would be a work, or something else, and whether that something else would be distinct enough from being a work to matter.

    IncognitoErgoSum,

    Just being "a bunch of numbers" doesn't stop it from being a work, it doesn't stop it from being a derivative work

    I suggest reading my entire comment.

    A trained AI is not a measurement of the natural world. It is a thing that has been created from the processing of other things -- in the common sense of the word, it is derivative of those works. What remains, IMO, is the question of whether it would be a work, or something else, and whether that something else would be distinct enough from being a work to matter.

    It's only a work if your brain is a work. We agree that in a digitized picture, those numbers represent the picture itself and thus constitute a work (which you would have known if you read beyond the first sentence of my comment). The weights that make up a neural network represent encodings into neurons, and as such should be treated the same way as neural encodings in a brain.

    ChemicalRascal,
    ChemicalRascal avatar

    I suggest reading my entire comment.

    I did, buddy. You're just wrong. You can copyright data. A work can be "just data". Again, we're not talking about a set of measurements of the natural world.

    It's only a work if your brain is a work. (...) The weights that make up a neural network represent encodings into neurons, and as such should be treated the same way as neural encodings in a brain.

    Okay, I see how you have the hot take that a generative model is brain-like to you, but that's a hot take -- it's not a legally accepted fact that a trained model is not a work.

    You understand that, right? You do get that this hasn't been debated in court, and what you think is correct is not necessarily how the legal system will rule on the matter, yeah?

    Because the argument that a trained generative model is a work is also pretty coherent. It's a thing that you can distribute, even monetise. It isn't a person, it isn't an intelligence, it's essentially part of a program, and it's the output of labour performed by someone.

    The fact that something models neurons does not mean it can't be a work. That's not... coherent. You've jumped from A to Z and your argument to get there is "human brain has neurons". Like, okay? Does that somehow mean anything that is vaguely neuron-like is not a work? So if I make a mechanical neuron, I can't copyright it? I can't patent it?

    No, that's absurd.

    ChemicalRascal,
    ChemicalRascal avatar

    No, I know how these neural nets are trained and how they're structured. They really don't contain any identifiable copies of the material used to train them.

    Go back and read my comment in full, please. I addressed that directly.

    chickenwing,

    I think this could technically fall under name, image, and likeness rights. Your voice could count as likeness and can’t be used without your consent. I know the big movie studios need to get permission from the family to do their creepy dead-actor cameos, like Disney did with Peter Cushing. I think this would be the same.

    Tigbitties,
    Tigbitties avatar

    Frito-Lay copied Tom Waits's voice and he sued. He won. I think the judge even awarded him more than he asked for.

    T156,

    Would we see the same outrage if the voices in these mods were just people that sounded like the original voice actors?

    Part of the issue might also be that the voice is being reproduced by a machine.

    If it’s someone else pretending to be them or a character that they’ve voiced, that’s fine, since it’s fundamentally someone else’s voice. By comparison, having an AI model do it removes that degree of separation, as it’s effectively an edited version of their voice being used in a way that they never signed up for.

    It’s unclear whether the same issue would arise if it was a piece of regular software that could cleverly edit an existing recording, and work that way, but I imagine that it would. It’s just that editing existing recordings is both a lot more work, and leaves obvious traces, whereas a voice model can be a bit more seamless.

    Could the voice actors suffer damage to their reputation if someone takes a model of their voice and ties them to a role/character/production that they didn’t want to be associated with?

    Creat,

    Technologically, it doesn’t just magically come up with the voice you want using AI. It uses training data, which you need explicit permission (even under existing copyright law) to use as the basis of a derivative work. It just happens without a human manually stitching it together. It’s now also capable of much more, and the lines definitely get blurrier for some things, but this at least seems pretty clear cut.

    If a human imitates a voice, that is still their own creation. It’s arguably even its own art form.

    Even if you could manage to create that voice using AI with only publicly available recordings, it likely depends highly on where you are in the world. There are laws that give you the right to your own image, so just because someone took a picture of you doesn’t mean they can do anything with it without your permission. It’s possible that this would extend to recordings of your voice in a similar way, if taken to court.

    TheChurn,

    The porn bit gets headlines, but it isn't the core of the issue.

    All of these models retain a representation of the original training data in their parameters, which makes training a violation of copyright unless it was explicitly authorized. The law just hasn't caught up yet, since it is easy to obfuscate this fact with model mumbo-jumbo in between feeding in voices and generating arbitrary output.

    The big AI players are betting that they will be able to entrench themselves with a massive data advantage before regulation locks down training and effectively kills any future competition. They will already have their models, and the worst case at that point is paying some royalties to people whose data was used in training.

    LoafyLemon,
    LoafyLemon avatar

    I'd like to know how you expect governments or even private institutions to enforce this, since most countries won't care about foreign laws.

    Ragnell,
    Ragnell avatar

    They can forbid companies from using the AI to do business in their areas, like the EU is doing with privacy laws. Google not being able to use its chatbot search in the US would be a big deal.

    LoafyLemon,
    LoafyLemon avatar

    Sounds to me like you would have to prove someone used AI in their work first, making it difficult to realistically enforce.

    Ragnell,
    Ragnell avatar

    Not hard when they're advertising it right now. And if they do try to keep it secret all the government will have to do is subpoena a look at the backend.

    But honestly, since when do we just not have laws because something is hard to prove? It's hard to prove someone INTENDS to murder someone, but that's a really important legal distinction. It's hard to prove someone's faking a mental illness, but that's another thing that's got laws around it. It's really hard to prove sexual assault, but that still needs to be outlawed too.

    Compared to that stuff? Proving someone used an AI is going to be a piece of cake with all the data that gets collected and the amount of work it would take to REMOVE the AI from a business process before the cops get there.

    LoafyLemon,
    LoafyLemon avatar

    Enforcing a potential AI ban in work environments is unrealistic right now because it's challenging to prove that AI was actually used for work purposes and then enforce such a ban. Let's break it down in simple terms.

    Firstly, proving that AI was used for work is not straightforward. Unlike physical objects or traditional software, AI systems often operate behind the scenes, making it difficult to detect their presence or quantify their impact. It's like trying to catch an invisible culprit without any clear evidence.

    Secondly, even if someone suspects AI involvement, gathering concrete proof can be tricky. AI technologies leave less visible traces compared to conventional tools or processes. It's akin to solving a mystery where the clues are scattered and cryptic.

    Assuming one manages to establish AI usage, the next hurdle is enforcing the ban effectively. AI systems are often complex and interconnected, making it challenging to untangle their influence from the overall work environment. It's like trying to remove a specific ingredient from a dish without affecting its overall taste or texture.

    Moreover, AI can sometimes operate subtly or indirectly, making it difficult to draw clear boundaries for enforcement. It's like dealing with a sneaky rule-breaker who knows how to skirt around the regulations, all you have to do is ask.

    Considering these challenges, implementing a ban on AI in work environments becomes an uphill battle. It's not as simple as flipping a switch or putting up a sign. Instead, it requires navigating through a maze of complexity and uncertainty, which is no easy task.

    bedrooms,

    Are you sure? Training with VA data counts as a privacy violation?

    Ragnell,
    Ragnell avatar

    No, that was just an example of how such a law may be enforced.

    Honestly, we may need to get our lawmakers to expand the definition of "identity theft" to really cover this stuff. A VA's voice is their innate, distinctive personal signature. It is their livelihood. We'll have to see what a court says about copyright but it's certainly not right for it to be just copied by an AI.

    stopthatgirl7,
    stopthatgirl7 avatar

    Thing is, the courts, in the US at least, have already made a decision about this.

    Bette Midler v. Ford Motor Co

    Impersonation of a voice requires permission from the original artist. AI is no different. It should require permission.

    bedrooms,

    All of these models retain a representation of the original training data in their parameters, which makes training a violation of copyright unless it was explicitly authorized.

    How can you argue it's copyright violation while also saying the law hasn't caught up?

    The law just hasn't caught up yet

    Dr_Cog,
    @Dr_Cog@mander.xyz avatar

    It’s a violation of the spirit of the law, just not the letter (yet). It’s a good bet that it will be updated soon, especially when you consider that copyright laws get regularly updated to protect the IP of large corporations

    reclipse,
    @reclipse@lemdro.id avatar

    All of these models retain a representation of the original training data in their parameters, which makes training a violation of copyright unless it was explicitly authorized

    I don’t think it is realistic to train an AI model on only a specifically authorized database.

    argv_minus_one,

    Sigh. I was kind of hoping that this would make for much more interesting game mods, but of course people had to use it for pornographic game mods and ruin the whole concept.

    lemonflavoured,
    lemonflavoured avatar

    I feel like at some point this is going to lead to a very clunky ban on using AI for voices, which specifically exempts human impressionists.

    Madison_rogue,
    Madison_rogue avatar

    Joan Is Awful...

    Absolutely Black Mirror territory here. I do hope that game studios work on stopping this.

    FaceDeer,
    FaceDeer avatar

    I don't think it's a good idea to have it be illegal to sound similar to another person.

    TheDankHold,

    Everyone has takes like this, as if we can’t just draw the line at making it illegal for a computer or algorithm to do this.

    FaceDeer,
    FaceDeer avatar

    We can, but why should we? The end result is the same.

    TheDankHold,

    It’s not though. That’s you naively ignoring the aspects that are different. Computer generated imitation is easier to create, can be created at a scale far eclipsing human action, and can be finely tuned to make it harder to discern.

    You can look up impressionists and you’ll find it’s a rather small club when looking for the true greats. Computers remove this barrier and allow any asshole with an internet connection to create a video of you screaming racial epithets if your voice is easy enough to access.

    The vast difference in scale can’t be ignored.

    FaceDeer,
    FaceDeer avatar

    Talking up the capabilities of AI voice acting is not really helping the case against it. If it's really so good, and laws are enacted that forbid mixing human and AI voice acting, then I expect the straightforward optimal solution would be to entirely eliminate the human voice actors going forward.

    TheDankHold,

    You write laws for the future. Unless you think AI-generated content has plateaued, which is, again, naive. Just because social media wasn’t popular at first doesn’t mean we should’ve waited on passing data privacy legislation like we have. It’s good to identify potential issues and attempt to mitigate them early, so we don’t end up with situations like our current climate crisis.

    FaceDeer,
    FaceDeer avatar

    I don't think it's plateaued, I think it's going to get significantly better from here onward.

    I'm not sure what laws you're proposing at this point. Are you suggesting that AI should be forbidden from "mimicking" a human voice actor? That's what I'm suggesting will lead even more quickly to AI-only projects that get rid of the human voice actors entirely, since having a human voice actor under laws like that would end up as a huge hindrance.

    JoshicShin,
    JoshicShin avatar

    I think you are on the right track with this. Reminds me of the tales of the early Luddites smashing automated looms.

    TheDankHold,

    Ai voice impressions should be rendered illegal without explicit consent from the entity being imitated. Simple.

    Also your extrapolation of potential events feels ridiculous. Tech is banned so it’s used more in commercial projects heavily subject to such legislation?

    FaceDeer,
    FaceDeer avatar

    If impressions are banned, i.e. voices that sound like existing human voice actors, then yes, I expect it would be used a lot more. Because the only safe way to use AI voices in that case would be to never have a human actor to begin with. Create a novel AI voice from scratch and use that for your character, and then you can freely generate new lines with no further legal or practical concerns.

    Whereas if you were to use a human voice actor for a character, you're stuck with that human voice actor. You can't do a quick virtual re-shoot without hauling him in for it, there's royalties for everything, and if the human voice actor dies or spouts off some unfortunate racist rants on social media or simply quits then you're screwed.

    Unless you're proposing banning all AI voices completely, including novel ones that were never imitating a specific human to begin with? That's rather the more ridiculous scenario.

    TheDankHold,

    It’s like you just skipped past the “with consent” caveat to go on your diatribe.

    So now that I’ve reiterated that it wouldn’t be entirely illegal, can you explain how that requirement will cause there to be no more VAs? In this world you’re imagining, how are these computational models creating voices? You talk about a “safe way to use” it in these circumstances but again, these companies still need data to generate voices. And this data would be protected through active seeking of consent.

    You aren’t “[creating] a novel voice from scratch,” that’s just not how the technology works. It needs a human to extract data from in order to compile something intelligible. Unless you want every animated feature to use robotic assistant voices. This is another reason why your perspective makes no sense and seemingly shows a complete lack of understanding of how these computational models work, and of how that comes together with my proposition.

    Then your last paragraph is just a confirmation that no, you haven’t fully read what I wrote. So one more time with gusto:

    Using computer generation to imitate a person using their own biometric data should be illegal unless explicit consent is given.

    IncognitoErgoSum,

    The end result is going to be basically the same regardless. Plenty of people (such as myself) who believe in the huge potential of AI to give creative power to regular people will volunteer our voices. Giving that creative power to everyone is worth far more, in my opinion, than gatekeeping the creation of art.

    Unless they're planning on making it illegal for a computer to imitate any human voice, I don't see where making a law against using a voice without consent would make a big substantive difference. Just re-voice the existing lines in Skyrim with new voices to maintain consistency and you're good (there's a Serana mod that already does this, for instance).

    TheDankHold,

    If you’re using new voices then congrats, that’s not the issue. The issue is people using existing voices to create audio the person didn’t consent to record. You gave an example of it being done right. Not to mention I’m pretty sure the Serana dialogue isn’t AI-generated, so it’s not even close to relevant in actuality.

    But since you seem to love the potential of AI would you be willing to send me an audio file of you pronouncing every possible phonetic sound the human mouth can make? I promise there won’t be audio of you talking about eating babies afterward because as you say, there’s no practical reason to require consent for these things. No one could possibly abuse technology to hurt other people. It’s never happened in history.

    AI is indeed a powerful tool that can be used to let more people explore their creativity. Your assumption that I felt otherwise is because you’re on the opposite end of the spectrum: so self-assured of its value that you’re blind to its real shortcomings and abusable points.

    I would describe my position more like: AI, like any new technology, is neutral. It’s usable for good and for bad. Thus it’s important to watch for ethical pitfalls that we may not have had to consider before due to the drastic way the new technology impacts society.

    IncognitoErgoSum,

    But since you seem to love the potential of AI would you be willing to send me an audio file of you pronouncing every possible phonetic sound the human mouth can make?

    In theory, absolutely.

    In practice, I'm not going to go through that much work just to make a point for a single fediverse comment. I'll be honest, though -- I'm not particularly worried about somebody using my voice to do a bad (or do a racism or whatever). It may happen, and I can live with it; I think the benefits far outweigh the cost, and in my experience, far more people use those sorts of things to do awesome stuff than to be shitty. Earlier today I was considering trying to put together an Open Voice project and collect volunteers to do exactly what you said.

    I've already released open source code over the years; people could potentially use that to do things I don't agree with as well, but frankly, as someone who has had work out in the wild available for use by everyone, the panic is vastly overblown.

    Your assumption that I felt otherwise is because you’re on the opposite end of the spectrum: so self-assured of its value that you’re blind to its real shortcomings and abusable points.

    Just because I feel that the potential benefits far outweigh the costs (as well as the draconian technical restrictions that would be required to prevent people from expressing themselves in a bad way), it doesn't follow that I'm somehow blind to the real shortcomings and abusable points of AI. I would appreciate it if you didn't make silly strawman assumptions about whether I've given something due consideration just because you don't like my conclusions.

    If you have a solution that wouldn't absolutely kill it (or put a horribly filtered version in the hands of a few massive corporations who charge the rest of us for the privilege of using it while using it themselves however they want), I'm all ears.

    FaceDeer,
    FaceDeer avatar

    You aren’t “[creating] a novel voice from scratch,” that’s just not how the technology works. It needs a human to extract data from in order to compile something intelligible.

    Much like with art AIs, the outputs don't necessarily have to slavishly mimic the style of any of the inputs. Train an AI with a bunch of different voices and then you can get it to generate a novel voice that isn't a copy of any specific one that it was trained on.

    Using computer generation to imitate a person using their own biometric data should be illegal unless explicit consent is given.

    This doesn't affect what I've said. If imitating a specific human comes with a bunch of annoying legal and economic hassles, then don't imitate a specific human. Create a novel voice and you're free of all of that.

    And yes, the technology lets you create a novel voice different from any of the ones it was trained on. I do know how these things work.

    TheDankHold,

    You should not be able to use someone else’s biometrics (voice is one) to generate content without their consent. Your example would be similarly illegal, because it unethically uses a person’s personal data without their consent for commercial or other purposes.

    It’s “novel” in that it’s an approximation of all its input data, tweaked to match the specifics of the request given. It still needs to use the data of real people or it can’t create anything. You have a surface level understanding if you don’t understand the importance of that seeding data.

    You don’t seem to value consent when it comes to the systematic harvesting of personal data for another persons benefit. I’ve been very clear that the issue is the lack of consent combined with current and future capabilities of the technology.

    valpackett,
    @valpackett@lemmy.blahaj.zone avatar

    How would one even prove that something actually was indeed created with this kind of synthesis?

    fearout, (edited )
    fearout avatar

    Who’s Joan? My Netflix only has a “Madison_rogue Is Awful” episode.

    But in all seriousness, the tech is here and it’s here to stay. Legal stuff can curb it for a bit, but you know the internet. It’s only going to grow.

    There should be some other way to deal with it besides “Forbid everything now!”.

    I’ve seen several future copyright systems being discussed which might provide a better way of doing things going forward. Like the one where all copyright is basically waived, but royalties are collected from any derivative work and distributed to all participants. Want to make a Star Wars fan movie? You’re free to do so (and so is everyone), but every actor you use and every writer whose part of the lore you use gets a small cut of whatever you earn on it (so non-profit derivative work doesn’t require paying royalties, similar to parodies in the US).

    Kinda like sampling in music currently works. Could be a better system than what we currently have, with corporations owning hundreds of years of usage for one story or likeness.

    vlad76,

    Damn it. All good things have to be ruined by creepy people.

    ThunderingJerboa,
    ThunderingJerboa avatar

    It is a can of worms we were going to have to figure out either way. The power of these tools is impressive, but they can get a little out of hand, as shown by fools making NSFW mods with them. That is only one aspect, though. We also have to keep in mind that companies could screw VAs out of jobs with this tech, which will likely matter far more.

    In a way this tech can also be liberating for bad actors, since we are now entering a new phase of "post-truth" reality, or perhaps better put, the democratization of fake news and false narratives. We were already at the door of this era, but these AIs put us way past that door now.

    It's rather unfortunate, since the power of this stuff can lead to some insanely cool concepts. Like the idea that fans could, in theory, restore cut content from games and even replicate the voice acting to make it feel closer to the original. Or inserting voice acting into games that never had it, or where it was nonviable to voice every character, like Baldur's Gate, Neverwinter Nights, Planescape: Torment, or the original Fallouts. Many of these games are beloved classics, but they are sometimes missing some of the things we are used to in more modern games.
