BreakDecks

@BreakDecks@lemmy.ml


BreakDecks,

Disposable vapes put more lithium into landfills than EVs do. Everyone throws their vape in the trash; nobody throws their EV battery module in the trash.

BreakDecks,

Uh, they’re being sued for over $200 Billion. What do you expect them to do, not fight it?

They’ve already shut down the “Emergency Library” they are being sued over, but the plaintiffs aren’t dropping the suit.

If losing this lawsuit destroys the IA, you should want them to fight like hell to win.

BreakDecks,

The emergency library followed the same legal framework that ebook lending follows at local libraries.

A library owns X copies of a book, and it removes some percentage of them from circulation so that it can lend digital copies in their place (usually via Libby).

All IA did was coordinate with libraries that were closed due to COVID to allocate a portion of their uncirculated books for IA’s lending system. It was never uncapped, and even used DRM to protect against piracy like Libby does.

Every book that was lent had a physical copy deliberately taken out of circulation for the purpose of allowing redistribution. It was entirely legitimate, and I commend them for doing it.
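The owned-to-loaned ratio described above can be sketched as a toy model (the class name and numbers are illustrative, not IA's or Libby's actual system): digital loans are capped by the number of physical copies deliberately withheld from circulation.

```python
class ControlledDigitalLending:
    """Toy model of CDL: active digital loans never exceed withheld physical copies."""

    def __init__(self, owned_copies: int, withheld_for_digital: int):
        assert withheld_for_digital <= owned_copies
        self.cap = withheld_for_digital  # physical copies pulled from circulation
        self.active_loans = 0

    def lend(self) -> bool:
        if self.active_loans >= self.cap:
            return False  # cap reached: borrower goes on a waitlist
        self.active_loans += 1
        return True

    def return_copy(self) -> None:
        if self.active_loans > 0:
            self.active_loans -= 1
```

The point of the cap is that lending a digital copy is never additive: each loan stands in for a real book sitting on a shelf, which is why the "Emergency Library" drew fire when it lifted that ratio.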

Publishers are already trying to fight against libraries that they feel threaten their profitability. This attack against IA is just a test case for going after local libraries, and Libby next. I want IA to fight this and win, because we’re fucked on multiple levels if they lose.

Don’t blame IA for fulfilling their mission to make knowledge free. Blame capitalists for attacking libraries in an attempt to make knowledge less free.

BreakDecks,

especially if we all do our small part…

Yeah, I’m done with this article. If you want to make me feel less doomed, start blaming the corporations responsible and demanding actual action from the top down.

If you’re opening with some “individual responsibility” bullshit, you’re contributing to the apathy problem, because fuck you I can’t fix this.

BreakDecks,

At least Germany and Austria can deal with you when you try to spread it. If that feels authoritarian to you, good. Cry about it.

BreakDecks,

We could just make speech that directly glorifies or encourages violence illegal. Basically just an extension of laws against violent threats or speech that does direct harm.

It would be much harder to abuse such a restriction on speech, and it would cover all violent or genocidal movements without singling out a specific ideology.

One of the best ways around the first amendment is to throw away any notion of offensiveness or obscenity, and focus on victims. You can’t fraudulently shout “Fire!” in a crowded theatre because the people who get injured in the panic will be victims. You can’t possess child porn because minors have no right to grant sexual consent and the existence of such media further victimizes them. You can’t distribute private porn made with your partner without their permission for similar reasons: because your partner didn’t consent to it, and would be victimized by its distribution. You cannot threaten to assault or kill someone, or call for others to assault or kill someone, because it amounts to conspiracy to commit a violent crime.

None of these restrictions on speech focus on the speech being offensive, they focus on the speech having direct harmful consequences to others. It draws the line of where your rights end at where others’ rights begin.

I don’t see why we can’t extend this to cover inherently violent ideologies like Nazism, where glorification of it, by definition, is a call for violence against specific members of our society. If you celebrate the Holocaust and demand similar action today, the victims are the people who you are asking the state to murder. There’s no reason to tolerate that as free speech, because the people being targeted have the right not to be murdered. If it’s illegal to call for the murder of 1 person, why not make it illegal to call for the murder of millions of people?

BreakDecks,

I would rather they have to do it in the shadows, and be able to be dealt with if they show their faces in the light of day.

BreakDecks,

Because people use Twitch on Linux with a VPN? This seems kinda obvious…

BreakDecks,

“Sky Priority” is just a Delta thing, on par with any airline loyalty program perk. You’re not getting airlines to give up their perks so easily, and I don’t think there’s a compelling reason to target those programs.

Pre-Check requires a background check and fingerprinting, and actually serves to make TSA’s job easier by routing some percentage of eligible travelers through a checkpoint with more relaxed security processes.

Clear is a bit different in that it is a private for-profit program that primarily grants users a fast-track through a mandatory government process. That’s a bit unethical. Though I do enjoy it for skipping the line at sporting events…

Instagram Advertises Nonconsensual AI Nude Apps (www.404media.co)

Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden on the dark corners of the internet, but are actively promoted to users by social media companies unable or...

BreakDecks,

The draw to these apps is that the user can exploit anyone they want. It’s not really about sex, it’s about power.

BreakDecks,

This is such a dumb argument. Nobody is claiming that the AI can show you what’s actually beneath a person’s clothes. The nudes being fake doesn’t resolve the ethical issue of creating porn of people who never agreed to it.

The people doing mental gymnastics about this stuff are just telling on themselves. Don’t make fake porn of real people, and if you do, be prepared to be rightfully treated as a sexual predator if anyone finds out.

BreakDecks,

You seem to have reading comprehension issues. They said that this could be done to their body, which is 100% true.

Any picture of anyone can be processed with an AI, and “nudified”. Yes, the AI generated portions of the image are fake, and likely won’t resemble the person’s actual body under their clothing. Results are probably more accurate for photos of people in swimsuits vs more conservative outfits but…

…that doesn’t matter. If you’re modifying a picture of a real person to make them nude, even without AI, it amounts to sexually violating the person in the original image. Even if you’re just photoshopping their face into porn, that’s fucking vile and I see no reason there shouldn’t be real consequences for it - especially if these images are shared with others.

Nobody defends this shit like you are unless they are doing it themselves. With that said, reevaluate yourself and stop sexually violating women.

BreakDecks,

It sounded like they thought the AI would literally reveal their body tho, which simply isn’t true.

They didn’t say anything like that. You can go back and re-read the thread yourself. If you were wrong, own up to it, but absolutely fuck off with this “well, ackshually” troll response.

If you view sexual assault as a form of free speech, expect to be treated as the kind of person you’re telling everyone that you are.

BreakDecks,

You didn’t have to come out and tell everyone that you’re one of those guys who doesn’t understand the concept of sexual exploitation and consent.

It literally doesn’t matter what you call this. Colloquially the technology is known as “Generative AI”, and it is fully automating the task of making fake nudes to the point that shady websites only require a single input image, and with a few layers of machine learning, are able to spit out a convincing nude.

It was just as fucked up when perverts sexually exploited people with Photoshop, so I don’t understand what your point here is. “AI” has made sexual exploitation fully automated, and there’s absolutely no excuse for defending this.

BreakDecks,

Sharing this screenshot again, to drive the point home.

https://lemmy.ml/pictrs/image/8ffa78b2-7400-4948-9093-a8d20319eaef.jpeg

BreakDecks,

This was from a test I did with a throwaway account on IG where I followed a handful of weirdo parents who run “model” accounts for their kids to see if Instagram would start pushing problematic content as a result (spoiler: yes they will).

It took about 5 minutes from creating the account to end up with nothing but dressed down kids on my recommendations page paired with inappropriate ads. I guess the people who follow kids on IG also like these recommended photos, and the algorithm also figures they must be perverts, but doesn’t care about the sickening juxtaposition of children in swimsuits next to AI nudifying apps.

Don’t use Meta products. They don’t care about ethics, just profits.

BreakDecks,

Send 'em to Zuck.

BreakDecks,

This 100%. I can’t even bring myself to buy new content for my Quest now that I’m aware of the issues (no matter how much I want the latest Beat Saber and Synth Riders DLC), especially since Meta’s Horizon, in my experience, puts adults into direct contact with children. At first I just dismissed metaverse games like VRChat or Horizon as being too popular with kids for me to enjoy it, but now I realize that it put me, an adult, straight into voice chats with tweens, which people should fucking know better than to do. My first thought was to log off because I wasn’t having fun in a kid-dominated space, but I have no doubt that these apps are crawling with creeps who see that as a feature rather than a problem.

We need education for parents that sharing pictures of their kids online comes with real risks, as does giving kids free rein to use the Internet. The laissez-faire attitude many people have towards social media needs to be corrected, because real harm is already being done.

Most of the parents that post untoward pics of their kids online are chasing down opportunities for their kids to model, and they’re ignoring the fact that a significant volume of engagement these photos receive comes from people objectifying children. There seems to be a pattern that the most revealing outfits get the most engagement, and so future pictures are equally if not more revealing to chase more engagement…

Parents might not understand how disturbing these patterns are until they’ve already dumped thousands of pictures online, and at that point they’re likely to be in denial about what they’re exposing their kids to, and/or too invested to want to reverse course.

We also need to have a larger conversation, as a society, about using kids as models at all. Pretty much every major manufacturer of children’s clothing is hiring real kids to model the clothes. I don’t think it’s necessary to be publishing that many pictures of kids online, nor is it acceptable to be doing so for profit. There’s no reason not to limit modeling to adults who can consent to putting their bodies on public display, and using mannequins for kids’ clothing. The sheer volume of kids’ swimsuit and underwear pictures hosted on e-commerce sites is likely a contributor to the capability Generative AI models have to create inappropriate images of children, not to mention the actual CSAM found in the LAION dataset most of these models are trained on.

Sorry for the long rant, this shit pisses me off. I need to consider sending 404 Media everything I know since they’re doing investigations into this kind of thing. My small scale investigation has revealed a lot to me, but more people need to be getting as upset as I am about it if we want to make the Internet less of a hellscape.

BreakDecks,

I logged into my throwaway account today just to check in on it since people are talking about this shit more. I was immediately greeted with an ad featuring hardcore pornography, among the pics of kids that still populate my feed.

I’ll spare you the screenshot, but IG is fucked.

BreakDecks,

Generative AI is being used quite prominently for the purposes of making nonconsensual pornography. Just look at the state of CivitAI, the largest marketplace of Stable Diffusion models online. It pretends to be a community for Machine Learning professionals, but behind the scenes it’s laying the groundwork for all of the problems we’re seeing right now. There’s not an actress or female celebrity that doesn’t have a TI or LoRA trained on their likeness - and the galleries don’t hold back on showing you what these models can do.

At least Photoshop never gained the specific reputation of being a tool for making fake porn, but the GenAI community is leaving no doubt that this is a major use case for image models.

Even HuggingFace turns a blind eye to pornifying models and lolicon datasets, and they’re basically the GitHub of AI models…

BreakDecks,

Yeah, let’s just sexually violate everyone. /s

Who the hell is upvoting this awful take? Please understand that it would never be equitable. If this became reality, it would be women and girls that were exploited the most viciously.

I guess if you don’t give a shit about people, especially women and girls, feeling safe in public at all, you would say something like this…

BreakDecks,

Sounds more like he’s saying there’s assholes on both sides.

BreakDecks,

No president in history has been removed from office by impeachment. Even Nixon resigned before he could be removed. Clinton served the rest of his term after impeachment. So did Trump, after being impeached twice.

BreakDecks,

No shit the VPN requires an open port, I never said otherwise, but if your router is the one running the server, you aren’t forwarding the port. The router itself is listening on its WAN interface.

The VPN prevents you from having to forward any ports, because the router allows you to tunnel in. The only open port will be whatever port the VPN server listens on, and it isn’t a forwarded port.

Source: I literally work at a VPN company.
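A minimal sketch of what this looks like with WireGuard running on the router itself (keys and addresses below are placeholders): the router listens directly on its WAN interface, so there is no port-forward rule anywhere — `ListenPort` is the only externally reachable port.

```ini
# /etc/wireguard/wg0.conf on the router — the router IS the server,
# so UDP 51820 is opened by the listener itself, not by a forwarding rule.
[Interface]
PrivateKey = <router-private-key>   ; placeholder
Address    = 10.0.0.1/24            ; VPN subnet
ListenPort = 51820                  ; the only exposed port

[Peer]
PublicKey  = <client-public-key>    ; placeholder
AllowedIPs = 10.0.0.2/32            ; tunneled client; LAN services stay unexposed
```

Once tunneled in, the client reaches LAN services over the VPN subnet, so nothing else ever needs to be forwarded through NAT.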

BreakDecks,

3/92 on VirusTotal is a great result. The only scanners reporting a problem are the ones that are always wrong.
