digdug

@digdug@kbin.social
digdug,

He's saying "crime pays," so the tax won't hurt them.

digdug,

Yes, but electricity has the potential to be generated from non-fossil-fuel sources.

digdug,

Ooh, you beat mine: 848330. (Unfortunately, mine was hacked over 5 years ago, and by the time I realized, their logs didn't go back far enough for me to validate that I was the original owner.)

digdug, (edited)

I just tried this a couple of different ways:

  1. Removing permission for "nearby devices" - unfortunately, this appears to block both Bluetooth and NFC (there's a rough adb sketch below).
  2. Turning off the phone's Bluetooth - NFC still works while the Bluetooth radio is off, but then you'd basically never be able to safely use Bluetooth anytime you aren't watching your car. Setting a PIN is still unfortunately the only way to go, and you have to hope that a dedicated attacker doesn't also find a way to capture it (e.g. with a camera zoomed in on your screen).
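
If anyone wants to reproduce the first test from a computer instead of the settings UI, here's a minimal Python sketch driving adb. It assumes USB debugging is enabled, and com.example.carapp is a placeholder for whatever app you're actually testing:

```python
import subprocess

# Placeholder package name - substitute the app you're actually testing.
PACKAGE = "com.example.carapp"

# On Android 12+, the "Nearby devices" toggle maps to these runtime
# permissions. android.permission.NFC is a separate install-time
# permission, so it can't be revoked the same way.
NEARBY_PERMS = [
    "android.permission.BLUETOOTH_SCAN",
    "android.permission.BLUETOOTH_CONNECT",
]

def adb(*args: str) -> str:
    """Run an adb command and return its stdout."""
    result = subprocess.run(
        ["adb", *args], capture_output=True, text=True, check=True
    )
    return result.stdout

def revoke_nearby(package: str) -> None:
    """Revoke the 'Nearby devices' runtime permissions for the package."""
    for perm in NEARBY_PERMS:
        adb("shell", "pm", "revoke", package, perm)

if __name__ == "__main__":
    revoke_nearby(PACKAGE)
    # Dump the package's permission state to confirm what changed.
    print(adb("shell", "dumpsys", "package", PACKAGE))
```
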
digdug, (edited)

I don't believe you're fully arguing in good faith here.

I'm assuming you've seen a naked adult. Even if you had never seen a naked young person, I don't believe for one second that you would be unable to infer what one might look like. You might not know for certain, but your best guess would likely be very accurate.

Generative AI can absolutely make those same inferences, so it does not need inappropriate training material to generate such images.

The AI knows what a young person looks like.
It knows what a clothed adult looks like.
It knows what an unclothed adult looks like.

An AI trained on 100% legal material could make that inappropriate inference without even trying.

Now, have all the popular AI models actually been trained on 100% legal material? I have no way of knowing, but you're incorrect to assume that a model's ability to output inappropriate images absolutely proves that such data was included in its training input. Edit: never mind, it definitely has been trained on inappropriate material, but that doesn't refute the point that it doesn't need to be.

digdug, (edited)

In this hypothetical, the AI would be trained on fully clothed adults and children, as well as on what many of those same adults look like unclothed. It might not get things completely right on its first attempt, but with some minor prompting it should be able to get pretty close. That said, the AI would know the correct head-size proportions from the clothed datasets alone, and it could probably infer limb proportions from them as well.

digdug,

I think there are two arguments going on here, though:

  1. It doesn't need to be trained on that data to produce it.
  2. It was actually trained on that data.

Most people arguing point 1 would be willing to concede point 2, especially since you linked evidence of it.

digdug,

This is the part of the conversation where I have to admit that you could be right, but I don't know enough to say one way or the other. And since I have no plans to become a pediatrician, I don't intend to go find out.

digdug,

You could set up btrfs snapshots, too. Of course, don't forget to take a snapshot before you break your configs.
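
For example, a pre-edit snapshot wrapper could be as small as this. It's just a sketch: it assumes the root filesystem is a btrfs subvolume, /.snapshots already exists on it, and the script runs as root:

```python
import datetime
import subprocess

# Assumed layout: / is a btrfs subvolume and /.snapshots is an existing
# directory on the same filesystem - adjust both for your setup.
SOURCE = "/"
SNAPSHOT_DIR = "/.snapshots"

def snapshot_before_edit(label: str) -> str:
    """Take a read-only btrfs snapshot so a broken config can be rolled back."""
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = f"{SNAPSHOT_DIR}/{stamp}-{label}"
    # -r makes the snapshot read-only, so it can't be clobbered by accident.
    subprocess.run(["btrfs", "subvolume", "snapshot", "-r", SOURCE, dest], check=True)
    return dest

if __name__ == "__main__":
    print("Snapshot taken:", snapshot_before_edit("pre-config-change"))
```

Tools like snapper automate the same idea, including scheduled and pre/post-change snapshots.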
