Instagram Advertises Nonconsensual AI Nude Apps

Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden on the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.

While parent company Meta’s Ad Library, which archives ads on its platforms, who paid for them, and where and when they were posted, shows that the company has previously taken down several of these ads, many ads that explicitly invited users to create nudes, and some of the accounts buying them, were still up until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.

Shadywack,

So, if the AI-generated tits look real, but they’re not HER tits, is it just less terrible?

CaptainEffort,

I mean literally yes, but that doesn’t mean it isn’t still horrible

intensely_human,

This is not okay, but this is nowhere near the most harmful application of AI.

The most harmful application of AI that I can think of would be disrupting a country’s entire culture via gaslighting social media bots, leading to increases in addiction, hatred, suicide, and murder.

Putting hundreds of millions of people into a state of hopeless depression would be more harmful than creating a picture of a naked woman with a real woman’s face on it.

Katrisia,

I don’t want to fall into a slippery-slope argument, but I really see this as the tip of a horrible iceberg. Seeing women as sexual objects starts with this kind of nonconsensual media, but it also includes nonconsensual approaches (like a man who thinks he can subtly touch women on crowded public transport and excuse himself with the lack of space), sexual harassment, sexual abuse, forced prostitution (it’s hard to know for sure, but possibly the majority of prostitution), human trafficking (75%–79% of which is for forced prostitution, which is why human trafficking mostly targets women), and even other forms of violence, torture, murder, etc.

Thus, women live their lives in fear (to varying degrees depending on their country and circumstances). They are restricted in many ways. All of this even in first-world countries. For example: homeless women fearing shelters because of the sexual assault and trafficking that happen there; women leaving or never entering jobs (the military, scientific exploration, etc.) because of their hostile sexual environments; being alert and often scared when alone because they can be targets; etc. I hopefully don’t need to explain the situation in third-world countries; just look at what’s legal and imagine from there…

This is a reality, one that is:

Putting hundreds of millions of people into a state of hopeless depression

Again, I want to be very clear: I’m not equating these tools to the horrible things I mentioned. I’m saying they are part of the same problem in a lighter presentation. It is the tip of the iceberg. It is a symptom of a systemic and cultural problem. The AI by itself may be less catastrophic in consequences, rarely leading to permanent damage (I can only see that being the case if the victim develops chronic or pervasive health problems from the stress of the situation, like social anxiety, or commits suicide). It is still important to acknowledge the whole machinery so we can measure what we are facing, and to really face it, because something must change. The first steps might be against these “on the surface,” “not very harmful” forms of sexual violence.

Melatonin,

Isn’t that happening? I assumed it was.

Sami_Uso,

Capitalism works! It breeds innovation like this! Good luck getting nonconsensual AI porn under your socialist government.

intensely_human,

It’s ironic because the “free market” part of capitalism is defined by consent. Capitalism is literally “the form of economic cooperation where consent is required before goods and money change hands”.

Unfortunately, it only refers to the two primary parties to a transaction, ignoring anyone affected by externalities to the deal.

linuxPIPEpower,

The above comment was written by a person whose lack of understanding of consent suggests they are almost certainly guilty of sex crimes.

BreakDecks,

Sharing this screenshot again, to drive the point home.

https://lemmy.ml/pictrs/image/8ffa78b2-7400-4948-9093-a8d20319eaef.jpeg

The_Tired_Horizon,

That makes me sick!! 😠

Google Play and co. are allowing similar apps.

LucidBoi,

What in the fuck are all these photos of kids? They’re not part of the ad?

BreakDecks,

This was from a test I did with a throwaway account on IG where I followed a handful of weirdo parents who run “model” accounts for their kids to see if Instagram would start pushing problematic content as a result (spoiler: yes they will).

It took about 5 minutes from creating the account to ending up with nothing but dressed-down kids on my recommendations page, paired with inappropriate ads. I guess the people who follow kids on IG also like these recommended photos, so the algorithm figures I must be a pervert too, but it doesn’t care about the sickening juxtaposition of children in swimsuits next to AI nudifying apps.

Don’t use Meta products. They don’t care about ethics, just profits.

kellyaster,

Wow, that is really disturbing. WTF, IG?

olutukko,

okay enough internet for me today

LucidBoi,

Coupled with the article about pedos blackmailing kids with their fake nudes to get real ones, this makes my stomach turn and eyes water. So much evil in this world. I am happy to say I deleted my FB and IG accounts a few days ago. WhatsApp is tough to leave due to family though… Slowly getting people to switch over to safer and more ethical alternatives.

BreakDecks,

This 100%. I can’t even bring myself to buy new content for my Quest now that I’m aware of the issues (no matter how much I want the latest Beat Saber and Synth Riders DLC), especially since Meta’s Horizon, in my experience, puts adults into direct contact with children. At first I just dismissed metaverse games like VRChat or Horizon as being too popular with kids for me to enjoy them, but now I realize that they put me, an adult, straight into voice chats with tweens, which people should fucking know better than to do. My first thought was to log off because I wasn’t having fun in a kid-dominated space, but I have no doubt that these apps are crawling with creeps who see that as a feature rather than a problem.

We need to educate parents that sharing pictures of their kids online comes with real risks, as does giving kids free rein to use the Internet. The laissez-faire attitude many people have towards social media needs to be corrected, because real harm is already being done.

Most of the parents who post untoward pics of their kids online are chasing down opportunities for their kids to model, and they’re ignoring the fact that a significant volume of the engagement these photos receive comes from people objectifying children. There seems to be a pattern where the most revealing outfits get the most engagement, so future pictures are equally if not more revealing to chase more engagement…

Parents might not understand how disturbing these patterns are until they’ve already dumped thousands of pictures online, and at that point they’re likely to be in denial about what they’re exposing their kids to, and/or too invested to want to reverse course.

We also need to have a larger conversation, as a society, about using kids as models at all. Pretty much every major manufacturer of children’s clothing hires real kids to model the clothes. I don’t think it’s necessary to publish that many pictures of kids online, nor is it acceptable to do so for profit. There’s no reason not to limit modeling to adults who can consent to putting their bodies on public display, and to use mannequins for kids’ clothing. The sheer volume of kids’ swimsuit and underwear pictures hosted on e-commerce sites is likely a contributor to the capability generative AI models have to create inappropriate images of children, not to mention the actual CSAM found in the LAION dataset most of these models are trained on.

Sorry for the long rant, this shit pisses me off. I need to consider sending 404 Media everything I know since they’re doing investigations into this kind of thing. My small scale investigation has revealed a lot to me, but more people need to be getting as upset as I am about it if we want to make the Internet less of a hellscape.

Karyoplasma,

The other day, I had an ad on Facebook that was basically lolicon. It depicted a clearly underage anime girl in a sexually suggestive position on a motorcycle with her panties almost off. I am in Germany, and Facebook knows I am in Germany; if I took a screenshot of that ad and saved it, it would probably be classed as CSAM in my jurisdiction. I reported the ad and was informed a few days later that FB found “nothing wrong” with it. Fuck off, you child predators.

BreakDecks,

I logged into my throwaway account today just to check in on it since people are talking about this shit more. I was immediately greeted with an ad featuring hardcore pornography, among the pics of kids that still populate my feed.

I’ll spare you the screenshot, but IG is fucked.

Eezyville,

I feel like you’ve just set me up for an FBI visit.

BreakDecks,

Send 'em to Zuck.

eran_morad,

Shit ain’t right.

nobleshift,

deleted_by_author

    Jarix,

    My only hope is it also reduces CSAM of real people. Please at least do that much!

    nobleshift,

    deleted_by_author

    Jarix,

    What the fuck do you mean, no?! This is happening right the fuck now. It’s already happening. You DON’T want it to decrease the total number of actual real children who are used and abused to feed this shit?

    I think you think I’m supporting this in some way. I AM NOT. I’m saying I hope that any of the pedos out there who are using this instead of taking action against actual children will not have already harmed children, or will at the very least reduce the total harm done.

    Christ, what the hell are we coming to if we can’t even try to find some fucking sanity in this situation?

    And for all that is good and right in the world, I also very much hope it doesn’t lead to MORE abuse.

    Can we at least hope for the best while trying to fix the worst?

    BreakDecks,

    The pedos out there are using AI to nudify pictures of real kids. That’s just going to drive up the demand for creep shots and child model photo sets to exploit.

    There may be a small percentage of offending pedophiles that switch to pure GenAI over pictures of real kids, but I don’t see GenAI ever playing a role in harm reduction given the harm it ultimately enables.

    One of the current sickening trends is for a predator to convince a kid to send underwear or swimsuit pics, and then blackmail them into more hardcore photos with nudified versions of the original pics. They’re already seeing an influx of that kind of CSAM online, the kind that involves abusing real kids on social media.

    I just wish America was less puritanical and taught kids about sex and boundaries to protect them, and that we had a functioning mental healthcare system that directly helps people who experience inappropriate sexual attractions, like pedophilia, before they go down these dark paths.

    Jarix, (edited )

    Look, we don’t know for sure. I’m grasping at silver linings made of straws. I don’t care how unlikely it is to be true, but there is a chance.

    A chance that some day, in months, years, or decades, we will find out whether or not it worked out in the best way it could have, given what’s already happening. But we will get an answer that will be pretty hard to disagree with.

    And I won’t be surprised when it isn’t what I’m hoping it might be. I won’t be devastated or have my worldview shattered.

    I’m not naive, I’m just hoping that we are wrong, even if it’s a bit ridiculous and there’s only evidence to the contrary along the way.

    What we know today may not be what we understand next year.

    Truth is stranger than fiction. We have so many problems now that it’s fine if we WANT an easy win we won’t be able to KNOW the answer to for some amount of time yet. But only if we are honest with ourselves that just because we want something doesn’t mean it has to happen. I also know that today…

    But holy hell, my guy, I’m fucking grasping that straw. This shit is too bleak, and we need something to keep us from taking vigilante action. What we don’t need is to stir the pot of fear, worry, and horror before it’s time to take action.

    If we can’t see paths to better places, we will have one hell of a hard time recognizing things that will help us get to that path. And if you don’t agree that we need a new path, it might be too late for you.

    CaptainEffort,

    That’s what I was hoping too, and I’m sure for some it does, but I also saw an article the other day about people using these to blackmail children into giving real nudes, so fuck me I guess.

    I still hope that for the majority of pedos this is solely something used to deter their urges. I’d like at least that much, I mean ffs.

    Jarix,

    Well fuck. The shittiest timeline

    CaptainEffort,

    No fucking kidding lmao

    HelloHotel, (edited )

    The idea that the children in this photo are meant to be seen in the same context as a porn site (or at least something using the Pornhub logo’s likeness) is disgusting.

    DISCLAIMER: I haven’t gone through this myself, but I know what porn addiction feels like. It’s not fun and will warp who you are on the inside.

    Anyone lured to this site for any reason: DO NOT ENGAGE, it WILL HURT YOU! If for whatever reason they’ve put their hooks in you and are reeling you in, use the strategies that Alcoholics Anonymous uses. LITERALLY ANYTHING is better than using pictures of REAL CHILDREN for sexual gratification.

    evlogii, (edited )

    Isn’t it kinda funny that the “most harmful applications of AI tools are not hidden on the dark corners of the internet,” yet this article is locked behind a paywall?

    intensely_human,

    The proximity of these two phrases meaning entirely opposite things indicates that this article, when interpreted as an amorphous cloud of words without syntax or grammar, is total nonsense.

    The arrogant bastards!

    Wispy2891,

    Something that can also happen: require Facebook login with some excuse, then blackmail the creeps by telling them “pay this extortion fee or we’re going to send proof of your creepiness to your contacts.”

    intensely_human,

    Another something that can also happen: require facebook login with some excuse, plant shit on your enemy’s computer, then blackmail them by threatening to frame them as creeps.

    Sometimes the reason a method is frowned upon is that it is equally usable for evil as for good.

    Kedly,

    ITT: A bunch of creepy fuckers who don’t think society should judge them for being fucking creepy.

    LadyAutumn,

    Lot of people in this thread who don’t seem to understand what sexual exploitation is. I’ve argued about this exact subject on threads like this before.

    It is absolutely horrifying that someone you know could take your likeness and render it into a form for their own sexual gratification. It doesn’t matter that it’s AI-rendered. The base image is still you, the face in the image is still your face, and you are still the object being sexualized. I can’t describe how disgusting that is. If you do not see the problem in that, I don’t know what to tell you. This will be used on images of normal, non-famous women. It will be used on pictures from the social media profiles of teenage girls. These ads were on a platform with millions of personal accounts of women and girls. It’s sickening. There is no consent involved here. It’s non-consensual pornography.

    Schadrach,

    The AI angle is just buzzword fearmongering though - this is something you could do with photoshop back in the 90s (and people did, usually with celebrities and with varying levels of quality).

    andros_rex,

    Middle school boys could not create realistic depictions of their classmates engaged in sex with Photoshop. At least not without significant time and effort. Now they can generate hundreds of photos in a matter of minutes.

    PraiseTheSoup,

    Wrong. That took time and effort and some level of knowledge from the user, meaning the end product was still somewhat rare. We already know that a decent “AI” image generator can spit these out in seconds with zero skill or knowledge required from the user.

    JackbyDev,

    Photoshop was not advertised for its ability to make fake nudes. The purpose of Photoshop is not to make fake nudes. It is a general purpose image editor. These two things are distinctly different.

    BreakDecks,

    You didn’t have to come out and tell everyone that you’re one of those guys who doesn’t understand the concept of sexual exploitation and consent.

    It literally doesn’t matter what you call this. Colloquially the technology is known as “Generative AI”, and it is fully automating the task of making fake nudes to the point that shady websites only require a single input image, and with a few layers of machine learning, are able to spit out a convincing nude.

    It was just as fucked up when perverts sexually exploited people with Photoshop, so I don’t understand what your point here is. “AI” has made sexual exploitation fully automated, and there’s absolutely no excuse for defending this.

    Schadrach,

    so I don’t understand what your point here is

    It’s that all the articles over the last year screaming about the dangers of AI, because it can be used for something an interested high school student could have done with an image editor 30 years ago (just more easily, and arguably at somewhat better quality, depending on the person using Photoshop), are being ridiculous, because they’re blaming the technology instead of the weirdo using it to doctor an image of that girl at their school and pass it around. And yes, anyone who makes and distributes one of these images of someone should be nailed for revenge porn, harassment, and whatever else might apply. I say “and distributes” only because if they never distribute it, no one would ever know it exists, so there would be no opportunity to bust them.

    The best use (i.e., the only good use) for one of these is to feed it an image of something that is definitely not the right kind of image for it and see what horrors it invents trying to fill in the blanks. Hand it your buddy with a beer belly and a mountain-man beard, or a dog, or a garden gnome or something.

    BreakDecks,

    Generative AI is being used quite prominently for the purposes of making nonconsensual pornography. Just look at the state of CivitAI, the largest marketplace of Stable Diffusion models online. It pretends to be a community for Machine Learning professionals, but behind the scenes it’s laying the groundwork for all of the problems we’re seeing right now. There’s not an actress or female celebrity that doesn’t have a TI or LoRA trained on their likeness - and the galleries don’t hold back on showing you what these models can do.

    At least Photoshop never gained the specific reputation of being a tool for making fake porn, but the GenAI community is leaving no doubt that this is a major use case for image models.

    Even HuggingFace turns a blind eye to pornifying models and lolicon datasets, and they’re basically the GitHub of AI models…

    ArmokGoB,

    Knowledge of fission is often applied to make nuclear bombs, but also to generate nuclear power. We shouldn’t blame AI as a whole for this just because some creeps use it for shitty applications.

    BreakDecks, (edited )

    That’s kinda why I brought up specific key players and how I consider them complicit. If you don’t want AI to be blamed as a whole, you should want those key players to behave ethically, or they’ll poison public perception of AI as a whole.

    Mastengwe,

    “Major Social Media Company Profits From App That Creates Unauthorized Nudes! Pay Us So You Can Read About It!”

    What a shitshow.

    Midnight,

    404 Media is worker-owned; you should pay them.

    dream_weasel,

    Is there such a thing as a consensual undressing app? Seems redundant

    lengau,

    Theoretically any of these apps could be used with consent.

    In practice I can’t imagine that would be a particularly large part of their market…

    Schadrach,

    Now I have this image of an OnlyFans girl who just fake-nudes all her pictures. It would make doing public-nudity-style pictures a lot easier.

    lengau,

    Y’know, as long as she’s open about it that would be a great use of the tech.

    dream_weasel,

    “hey send me some nudes!”

    “Ugh… I’m already on the couch in my pajamas. Here’s a pic of me at the coffee shop today, just use the app, it’s close enough.”

    melpomenesclevage,

    that would just be instructions, wouldn’t it?

    reverendsteveii,

    I guess it’s really in whether you use it with consent. I used one on my own picture just to see how it worked. It gave me huge tits but other than that was scarily accurate.

    dream_weasel,

    Neat. Ladies only or does it do dudes too?

    reverendsteveii,

    I’m a dude, it’s just a clever name. It’ll do dudes, it’s just gonna give you huge tits. What you’re into is, of course, your business.

    evranch,

    My interest in this topic just went from 0 to 10 upon realizing the humour potential of passing it around to see all my bros with huge tits, but only if it worked like a Snapchat filter.

    Also I have a friend who already has huge tits, and I’ve seen them IRL so I’m curious what it would do

    Schadrach,

    Also I have a friend who already has huge tits, and I’ve seen them IRL so I’m curious what it would do

    Being serious for a moment: it depends on the source image. If it can tell where the contours of the tits are in the source image, they’ll be closer to the right size and shape; otherwise it’s going to find something it thinks are the contours and map out tits that match those, then a generic torso that matches the shape of where it thinks the torso is and the skin tone of the face. It’s not magic, it’s just automating what a horndog with Photoshop, a photo of you, and a porn collection big enough to find someone with a similar body type could do back in the 90s.

    evranch,

    I’m familiar with how ML works so it’s not magic to me either, but the actual result is what would intrigue me. Since she has big naturals obviously they hang pretty heavy when they’re set free.

    But if I fed it a picture of her wearing a tight push-up bra, which could easily give off the impression that she had implants, would I get a pair of bolt-ons back? Or would it be able to pick up on the signs of real tits and add some sag?

    Seeing how it’ll put tits on men it’s obviously not an exact science lol

    chatokun,

    There isn’t, but emphasis on why it’s an issue is always a good thing to do. Same reason people get upset when some articles say “had sex with a minor” or “involved in a relationship with a minor” when the accurate crime is “raped a minor.”

    HelloHotel,

    If you (the news) are going to use flowery language, at least imply it’s a crime!

    • “Sexually coerced a minor”
    • or “groomed a minor for sex”
    • or “had a relationship where the power dynamics were so one-sided that the child could not give consent”
    • or maybe just say “raped a minor”

    It’s not that hard!

    ehxor,

    BRB. Got an idea for a start-up

    UnderpantsWeevil,

    I assume that’s what you’d call OnlyFans.

    That said, the irony of these apps is that it’s not the nudity that’s the problem, strictly speaking. It’s taking someone’s likeness and plastering it on a digital mannequin. What social media has done is become the online equivalent of going through a girl’s trash to find an old comb, pulling the hair off, and putting it on a Barbie doll that you then use to jerk/jill off.

    What was the domain of 1980s perverts from comedies about awkward high schoolers has now become a commodity we’re supposed to treat as normal.

    creditCrazy,

    You just took my feelings on this issue and put them into words.

    CaptainEffort,

    Idk how many people are viewing this as normal, I think most of us recognize all of this as being incredibly weird and creepy.

    UnderpantsWeevil,

    Idk how many people are viewing this as normal

    Maybe not “Lemmy” us. But the folks who went hog wild during The Fappening, combined with younger people who are coming into contact with pornography for the first time, make a ripe base of users who will consider this the new normal.

    CaptainEffort,

    Yeah damn, that’s true.

    An obvious answer would be to talk to younger people about it, to explain how gross and violating it is. Even if it doesn’t become illegal, there are plenty of legal things that people avoid and recognize are bad because they were taught correctly.

    Unfortunately, due to how puritan our society is, I can’t imagine many parents would be willing to talk to their kids about stuff like this.

    PiratePanPan,

    Lemmy rolls worst comment section, asked to leave the Fediverse

    EdibleFriend,

    YouTube has been running these for like 6 or 7 months, even with famous people in the ads. I remember one for a while with Ortega.

    MenacingPerson,

    I guess that’s an ORTEGAsm

    CrayonRosary,

    Don’t use that as lube.

    EdibleFriend,

    I sat down with tacos as I opened up that reply.

    Witch.

    dan1101, (edited )

    Yet another example of multi-billion-dollar companies that don’t curate their content because it’s too hard and expensive. Well, too bad, maybe you only profit $46 billion instead of $55 billion. Boo hoo.

    themeatbridge,

    It’s not that it’s too expensive, it’s that they don’t care. They won’t do the right thing until and unless they are forced to, or it affects their bottom line.

    KeenFlame,

    An economic entity cannot care; I don’t understand how people expect them to. They are not human.

    UnderpantsWeevil,

    Economic entities aren’t robots, they’re collections of people engaged in the act of production, marketing, and distribution. If this ad/product exists, it’s because people made it exist deliberately.

    KeenFlame,

    No they are slaves to the entity.

    They can be replaced

    Everyone from top to bottom can be replaced

    And will be unless they obey the machine’s will

    It’s crazy talk to deny this fact because it feels wrong

    It’s just the truth and yeah, it’s wrong

    UnderpantsWeevil,

    Everyone from top to bottom can be replaced

    Once you enter the actual business sector and find out how much information is siloed or sequestered in the hands of a few power users, I think you’re going to be disappointed to discover this has never been true.

    More than one business has failed because a key member of the team left, got an ill-conceived promotion, or died.

    TwilightVulpine,

    Wild that since the rise of the internet it’s like they decided advertising laws don’t apply anymore.

    But Copyright though, it absolutely does, always and everywhere.

    Aermis,

    Your example is a $9 billion difference. This would not cost $9 billion. It wouldn’t even cost $1 billion.

    Bizarroland,

    Yeah realistically you're talking about a team of 10 to 30 people whose entire job is to give the final thumbs up or thumbs down to an ad.

    You're talking one to three million dollars a year, maybe throw an extra million on for the VP.

    Chump change; they just don’t want to pay it ’cuz nobody’s forcing them to.

    JJROKCZ,

    It would take more than 10-30 people to run a content review department for any of the major social media firms, but your point still stands that it wouldn’t be a billion annually. A few tens of millions annually between wages, benefits, equipment, and software, all combined; see the rough numbers sketched below.
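
    As a back-of-the-envelope check on the figures in this sub-thread (a sketch only; the headcount range comes from the comments above, while the fully loaded cost per reviewer is an assumption, not a reported number):

```python
# Rough annual cost of a human ad-review team, under assumed inputs.
FULLY_LOADED_COST = 150_000  # USD/year per reviewer (salary + benefits + overhead), assumed

for headcount in (10, 30, 600):  # small team ... full department
    annual = headcount * FULLY_LOADED_COST
    print(f"{headcount:4d} reviewers ≈ ${annual / 1_000_000:,.1f}M/year")

# Output:
#   10 reviewers ≈ $1.5M/year
#   30 reviewers ≈ $4.5M/year
#  600 reviewers ≈ $90.0M/year  (tens of millions, still nowhere near $1B)
```

    Even the most generous staffing assumption here lands orders of magnitude below the billions-in-profit framing earlier in the thread.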

    realharo,

    Shouldn’t AI be good at detecting and flagging ads like these?

    kerrigan778,

    “Shouldn’t AI be good” nah.

    lengau,

    Build an AI that will flag immoral ads and potentially lose you revenue

    Build an AI to say you’re using AI to moderate ads but it somehow misses the most profitable bad actors

    Which do you think Meta is doing?
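
    For what it’s worth, the detection half of that trade-off is technically straightforward. Here is a minimal sketch of what automated ad screening could look like, assuming an off-the-shelf zero-shot CLIP classifier from the Hugging Face transformers library; the policy labels, threshold, and file name are made up for illustration, and a real pipeline would feed flagged creatives to human reviewers rather than auto-decide:

```python
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

# Zero-shot screening of an ad creative against policy descriptions.
model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Hypothetical labels; a production system would use a vetted policy taxonomy.
labels = [
    "an ordinary product advertisement",
    "an advertisement for an app that undresses people in photos",
]

def flag_ad(path: str, threshold: float = 0.5) -> bool:
    """Return True if the ad image should be escalated to human review."""
    image = Image.open(path)
    inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
    probs = model(**inputs).logits_per_image.softmax(dim=1)  # P(label | image)
    return probs[0, 1].item() >= threshold

# flag_ad("ad_creative.jpg")  # route to a reviewer if True
```

    Which is lengau’s point: the hard part isn’t building the classifier, it’s whether the platform wants it to catch its own paying advertisers.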

    UnderpantsWeevil,
    @UnderpantsWeevil@lemmy.world avatar

    Well, too bad, maybe you only profit $46 billion instead of $55 billion.

    I can’t possibly imagine this quality of clickbait is bringing in $9B annually.

    Maybe I’m wrong. But this feels like the sort of thing a business does when it’s trying to juice the same lemon for the fourth or fifth time.

    bitwaba,

    It’s not that the clickbait is bringing in $9B, it’s that it would cost $9B to moderate it.

    _sideffect,

    Good, let all the celebs come together and sue Zuck into the ground.

    UnderpantsWeevil,

    It’s funny how many people leapt to the defense of the Section 230 liability protection in Title V of the Telecommunications Act of 1996, as it helps shield social media firms from assuming liability for shit like this.

    Sort of the Heads-I-Win / Tails-You-Lose nature of modern business-friendly legislation and courts.

    uriel238,

    Section 230 is what allows for social media at all, given that the problem of content moderation at scale is still unsolved. Take away 230 and no company will accept the liability. But we will have underground forums teeming with white-power terrorist signalling, CSAM, and spam offering better penis pills and Nigerian princes.

    The Google advertising system is also difficult to moderate at scale, but since Google makes money directly off ads, and loses money when YouTube content is not brand safe, Google tends to be harsh on content creators and lenient on advertisers.

    It’s not a new problem, and nudification software is just the latest version of X-Ray Specs (which is to say we’ve been hungry to see teh nekkid for a very long time). The worst problem is when adverts install spyware or malware onto your device without your consent, which is why you need to adblock Forbes Magazine… or really just everything.

    However, much of the world’s public discontent is fueled by information on the internet (some false, some misleading, some true, and a whole lot more that’s simultaneously true and heinous than we’d like in our society). So most of our officials would be glad to end Section 230 and shut down the flow of camera footage showing police brutality, or starving people in Gaza, or fracking mishaps releasing gigatons of rogue methane into the atmosphere. Our officials would very much love it if we’d go back to being uninformed, with the news media telling us how it’s sure awful living in the Middle East.

    Without 230, we could go back to George W. Bush era methods, and just get our news critical of the White House from foreign sources, and compare the facts to see that they match, signalling our friends when we detect false propaganda.

    GreenTacklebox,

    This reminded me of those kids who made pornographic videos of their classmates.
