
FaceDeer

@FaceDeer@fedia.io

Basically a deer with a human face. Despite probably being some sort of magical nature spirit, his interests are primarily in technology and politics and science fiction.

Spent many years on Reddit and then some time on kbin.social.


FaceDeer,

Camera-makers, too. And people who make pencils. Lock the whole lot up, the sickos.

FaceDeer,

"If AI has the means to generate inappropriate material, then that means the developers have allowed it to train from inappropriate material."

That's not how generative AI works. It's capable of creating images that include novel elements that weren't in the training set.

Go ahead and give one a bonkers image description that doesn't exist in its training data and there's a good chance it'll be able to make an image for you. The classic example is the "avocado chair": an early image generator was able to produce many plausible images of one despite only having been trained on images of avocados and chairs. It understood the two general concepts and was able to figure out how to meld them into a single depiction.

FaceDeer,

Image AIs also don't act or respond on their own. You have to prompt them.

FaceDeer,

The trainers didn't train the image generator on images of Mr. Bean hugging Pennywise, and yet it's able to generate images of Mr. Bean hugging Pennywise. Yet you insist that it can't generate inappropriate images without having been specifically trained on inappropriate images? Why is that suddenly different?

FaceDeer,

No, you keep repeating this but it remains untrue no matter how many times you say it. An image generator is able to create novel images that are not directly taken from its training data. That's the whole point of image AIs.

FaceDeer,

The person who was charged was using Stable Diffusion to generate the images on their own computer, entirely with their own resources. So it's akin to a company that makes 3D printers selling one to someone, who then uses it to build a gun.

FaceDeer,

Better a dozen innocent men go to prison than one guilty man go free?

FaceDeer,

First, you need to figure out exactly what it is that the "blame" is for.

If the problem is the abuse of children, well, none of that actually happened in this case so there's no blame to begin with.

If the problem is possession of CSAM, then that's on the guy who generated the images, since they didn't exist at any point before then. The trainers wouldn't have needed any of that in the training set, so if you want to blame them you're going to need a completely separate investigation into that; the ability of the AI to generate images like that doesn't prove anything on its own.

If the problem is the creation of CSAM, then again, it's the guy who generated them.

If it's the provision of general-purpose art tools that were later used to create CSAM, then sure, the AI trainers are in trouble. As are the camera makers and the pencil makers, as I mentioned sarcastically in my first comment.

FaceDeer,

You suggested a situation where "many people would get off charges of real CSAM because the prosecutor can't prove that it wasn't AI generated." That implies that in that situation AI-generated CSAM is legal. If it's not legal, then what does it matter whether it's AI-generated or not?

FaceDeer,

You realize that there are perfectly legal photographs of female genitals out there? I've heard it's actually a rather popular photography subject on the Internet.

"Do you see where I'm going with this? AI only knows what people allow it to learn..."

Yes, but the point here is that the AI doesn't need to learn from any actually illegal images. You can train it on perfectly legal images of adults in pornographic situations, and also perfectly legal images of children in non-pornographic situations, and then when you ask it to generate child porn it has all the concepts it needs to generate novel images of child porn for you. The fact that it's capable of that does not in any way imply that the trainers fed it child porn in the training set, or had any intention of it being used in that specific way.

As others have analogized in this thread, if you murder someone with a hammer that doesn't make the people who manufactured the hammer guilty of anything. Hammers are perfectly legal. It's how you used it that is illegal.

FaceDeer,

Whereas I'm enjoying many of the new AI-powered features that Microsoft has been coming up with lately.

But echo chambers gonna echo, I guess.

FaceDeer,

Check the upvote/downvote counts on my comment vs. macattack's. It's nigh impossible to say anything positive about AI around here.

FaceDeer,

No, I sound like someone who likes many of the new AI-powered features that Microsoft has been coming up with lately.

I don't use Linux. I don't think about it at all, it doesn't affect me.

FaceDeer,

The state of the art for small models is improving quite dramatically, quite quickly. Microsoft just released the Phi-3 model family under the MIT license. I haven't played with them myself yet, but the comments are very positive.

Alternately, just turn that feature off.

FaceDeer,

But then you get that awkward situation where you go on vacation, open your luggage to get a fresh pair of socks or whatever, and find that you brought nothing but guns and ammo along with you on your trip.

FaceDeer,

It is entirely possible for both sides of a conflict to be committing war crimes.

FaceDeer,

Yeah, sadly. I've been reluctant to mention I'm not liking the current season because when I didn't like the previous season that made me a sexist. Now if I don't like the new season I'll be a racist too.

FaceDeer,

I'll never understand why all the TV producers think they can get away with cutting all the corners on writing.

My previous comment shows one possible reason - I said I didn't like one of her seasons and now I'm getting downvoted. There's a ready-made excuse to sling at anyone who criticizes the show.

I was actually excited to see a female doctor when it was announced, but I'm not going to like a show simply because of the gender or race or whatever of the actors. I saw that the quality of the writing was bad and so I left. The new season hasn't exactly enticed me back.

FaceDeer,

Aren't there lots of folks in the comments crowing about how "the embargo is working" and such? If Americans are taking credit for it then it seems quite reasonable to blame them for it.

FaceDeer,

The only way I can imagine this working is by twisting the definition of "search engine" enough that you can claim there aren't any, when really they still exist, just under a different name.

Search engines aren't actually the "problem" that OP wants to address here, though. He just doesn't like the specific search engines that exist right now. What he should really be asking is how a search engine could be implemented without the particular flaws that bother him.

FaceDeer,

I once had someone respond with astonishment when they found out: "But you're such a good person!"

Thanks, I guess?

FaceDeer,

With comments like this he likely goes through new accounts at a very rapid pace.

FaceDeer,

Anyone can download a torrent containing historical Reddit comments; Reddit surely has at least that, if not a full edit/delete history of every comment. The only people you're thwarting by deleting your comments are other humans who might stumble across your old threads in Google.

FaceDeer,

And also they're posting about it on a completely open platform that any AI trainer could trivially be "harvesting" as well.
