PSA: You can upload images to a Lemmy instance without anyone knowing

EDIT

TO EVERYONE ASKING ME TO OPEN AN ISSUE ON GITHUB: ONE HAS BEEN OPEN SINCE JULY 6: github.com/LemmyNet/lemmy/issues/3504

June 24 - github.com/LemmyNet/lemmy/issues/3236

TO EVERYONE SAYING THAT THIS IS NOT A CONCERN: Everybody has different laws in their countries (in other words, not everyone is American), and whether or not an admin is liable for such content residing on their servers without their knowledge, don’t you think it’s still an issue anyway? Are you not bothered by the fact that somebody could be sharing illegal images from your server without you ever knowing? Is that okay with you? Or are you only saying this because you’re NOT an admin? Different admins have already responded in the comments and have suggested ways to solve the problem because they are genuinely as concerned about it as I am. Thank you to all the hard-working admins. I appreciate and love you all.


ORIGINAL POST

You can upload images to a Lemmy instance without anyone knowing that the image is there if the admins are not regularly checking their pictrs database.

To do this, you create a post on any Lemmy instance, upload an image, and never click the “Create” button. The post is never created but the image is uploaded. Because the post isn’t created, nobody knows that the image is uploaded.

You can also go to any post, upload a picture in the comment, copy the URL and never post the comment. You can also upload an image as your avatar or banner and just close the tab. The image will still reside in the server.

You can (possibly) do the same with community icons and banners.
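
To make the reproduction concrete, here is a minimal sketch in Python. The /pictrs/image endpoint, the images[] multipart field, and the jwt cookie are assumptions based on how the Lemmy web UI typically talks to pict-rs; verify them against your instance’s version.

```python
# Minimal sketch of the "upload but never post" reproduction.
# ASSUMPTIONS: the /pictrs/image proxy endpoint, the "images[]" multipart
# field, and the "jwt" auth cookie mirror what the Lemmy web UI does --
# verify against your instance before relying on this.
import requests

INSTANCE = "https://lemmy.example"  # hypothetical instance
JWT = "..."                         # any logged-in user's token

with open("cat.png", "rb") as f:
    resp = requests.post(
        f"{INSTANCE}/pictrs/image",
        files={"images[]": ("cat.png", f, "image/png")},
        cookies={"jwt": JWT},
        timeout=30,
    )

# The returned alias is now a live URL on the server, even though no post,
# comment, avatar, or banner references it.
print(resp.json())
```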

Why does this matter?

Because anyone can upload illegal images without the admin knowing and the admin will be liable for it. With everything that has been going on lately, I wanted to remind all of you about this. Don’t think that disabling cache is enough. Bad actors can secretly stash illegal images on your Lemmy instance if you aren’t checking!

These bad actors can then share these links around and you would never know! They can report it to the FBI and if you haven’t taken it down (because you did not know) for a certain period, say goodbye to your instance and see you in court.

Only your backend admins who have access to the database (or object storage or whatever) can check this, meaning non-backend admins and moderators WILL NOT BE ABLE TO MONITOR THESE, and regular users WILL NOT BE ABLE TO REPORT THESE.

Aren’t these images deleted if they aren’t used for the post/comment/banner/avatar/icon?

NOPE! The image actually stays uploaded! Lemmy doesn’t check whether the images are used! Try it out yourself. Just make sure to save the link, either by copying the link text or by clicking the image and choosing “copy image link”.

How come this hasn’t been addressed before?

I don’t know. I am fairly certain that this has been brought up before. Nobody paid attention but I’m bringing it up again after all the shit that happened in the past week. I can’t even find it on the GitHub issue tracker.

I’m an instance administrator, what the fuck do I do?

Check your pictrs images (good luck) or nuke it. Disable pictrs, restrict sign ups, or watch your database like a hawk. You can also delete your instance.

Good luck.

homesnatch,

Because anyone can upload illegal images without the admin knowing and the admin will be liable for it.

The admin/company isn’t liable until it is reported to them and they don’t do anything about it… That’s how all social media sites work; Google isn’t immediately liable if you upload illegal material to GDrive and share it anonymously.

bmygsbvur,

Doesn’t change the fact that this is an issue. Besides, do you think American law applies everywhere?

AphoticDev,

This can be solved very easily by a cron job to clean out the folder periodically, if you’re worried about it.

bmygsbvur,

Very easily you say? Maybe tell us what this cron job is so we can all add it?

AphoticDev,

Just make a cron job that runs the rm command every day or whatever to clean out the files. Then run a SQL query at the same time to truncate any draft posts in the database. There’s no logic to this method - it just clears out the files and records related to draft posts - but it’s fast and effective.

There’s a small chance it might fuck somebody up if they were writing a post at that exact moment, but you can schedule the cron for when your instance is the quietest.
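
Taken literally, that sweep could look like the sketch below (meant to be run from cron; the storage path is an assumption). Note the caveat in the comments: without cross-checking the Lemmy database for references, an age-based rm cannot tell an orphaned upload from an image that is in use, which is the “no logic” trade-off described above.

```python
#!/usr/bin/env python3
# Blunt age-based sweep in the spirit of the suggestion above, run daily
# from cron. WARNING: without cross-checking the Lemmy database for
# references, this cannot distinguish orphaned uploads from images that
# are in use -- it deletes both once they pass the age threshold.
import os
import time

PICTRS_FILES = "/var/lib/pictrs/files"  # hypothetical path; adjust to yours
MAX_AGE_SECONDS = 24 * 60 * 60          # "every day or whatever"

now = time.time()
for root, _dirs, names in os.walk(PICTRS_FILES):
    for name in names:
        path = os.path.join(root, name)
        if now - os.path.getmtime(path) > MAX_AGE_SECONDS:
            os.remove(path)
```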

mint,
@mint@ryona.agency avatar

@bmygsbvur Pleroma is exactly the same and no one cared in six years.

bmygsbvur,

Doesn’t change the fact that this is an issue.

rektifier,

Wasn’t Facebook also found to store images that were uploaded but not posted? This is just a resource leak. I can’t believe no one has mentioned this phrase yet. I’m more concerned about DoS attacks that fill up the instance’s storage with unused images. I think the issue of illegal content is being blown out of proportion. As long as it’s removed promptly (I believe the standard is one hour) once the mods/admins learn about it, there should be no liability. Otherwise every site that allows users to post media would be dead by now.

bmygsbvur,

Whether it’s illegal content or storage-filling DoS attacks, the issue needs to be addressed.

newline,

I’m a pentester and security consultant. From my point of view, this vulnerability has more impact than just a resource leak or DoS. We all know how often CSAM or other illegal material is uploaded to communities here as actual posts (where hundreds of viewers run into it and report it). Now imagine them uploading it and spreading it like this, where only the admin can catch it, and only if they go out of their way to check.

I wouldn’t call this a high-risk issue, for sure. But it’s a significant security risk regardless.

Kecessa,

Pedo trolls will be the death of Lemmy, you heard it here first!

bmygsbvur,

Which is why we need to act now.

Serinus, (edited )

That’s part of the problem with having an illegal series of bits: of course people are going to use it as a weapon.

I don’t think those images should be made fully legal, but maybe we should calm the fuck down about two notches. We should keep in mind that the real crime is creating the pictures. Being effectively legal-bombed by them is kind of ridiculous. As is having to keep the detection tools secret.

If you’re on a grand jury for csam, maybe you should actually see the evidence (with limited censorship) before you indict someone.

Maybe I’m wrong, but I don’t think seeing a small number of pictures is going to scar you for life. I’ve seen goatse. I’ve seen people decapitated. It’s not pleasant, and I avoid those things, but it’s not scarring.

The Station Nightclub Fire is scarring. I’ve recommended that video to people because it’s scarring in a way that can save lives. Seeing that stuff every day would absolutely be scarring.

I don’t want that kind of stuff to become common, but I am disturbed that people are afraid of unused images hiding on their Lemmy server.

Kecessa,

Fuck you, pedo.

SuddenDownpour,

Regardless of the debate over whether admins should be legally liable for not deleting child abuse files they are unaware of,

Maybe I’m wrong, but I don’t think seeing a small number of pictures is going to scar you for life. I’ve seen goatse. I’ve seen people decapitated. It’s not pleasant, and I avoid those things, but it’s not scarring.

You shouldn’t use your own experiences to make this generalisation, given that people working at agencies prosecuting pederasts often have to receive therapy or even leave the job after continued exposure.

I am disturbed that people are afraid of unused images hiding on their Lemmy server.

Don’t you think it’s logical for someone to be worried about being vulnerable to being accused of what likely is, in many legal systems, a crime?

Serinus,

Yeah, I think continued exposure is different than a one-off thing. It’s why I used the grand jury example.

And I do think it’s logical. That’s the problem. My entire point is that csam shouldn’t be so easy to weaponize.

Maybe seeking, selling, or intentionally distributing should be the crime.

Xylight,

FYI, this requires a JWT, so if registrations are closed on your instance you don’t have to worry.

bmygsbvur,

This is for public instances.

ipkpjersi,

It seems like self-hosting your own Lemmy instance with registrations, communities, and pretty much everything else turned off is still very safe to do. I still want to self-host my own Lemmy instance some time when I have more time. Though I’d rather wait for things to be more stable first; there are bugs I’d like to see ironed out before doing that. One example: I still find it annoying that upvoting a comment in a thread deletes whatever comment you’re currently typing.

Rentlar,

How come this hasn’t been addressed before?

Because pictrs and most other components of Lemmy were designed for a much smaller use case by a very small development team. It was designed primarily by people volunteering their time and expertise. Most of the contributors have other things to do on a full-time basis. If you really want to see a change like this implemented NOW, then code it yourself, file a new issue directly on their page with potential solutions, or donate to the people working on it.

Your post is good for the most part, but my patience is limited for the kind of entitled attitude you show under that heading specifically. Thanks for hearing me out.

habanhero,

OP is flagging a legitimate issue that can actually put instance owners at risk. Raising the issue that instance owners can unwittingly host illegal content and be liable for it - how is that entitled?

Totally understand that the Lemmy devs are a small team, but usage of the software is exploding now, and not being able to keep up is a problem of scale. Gatekeeping others from raising issues does not help it get better; in fact, it discourages issue reports and promotes a head-in-the-sand culture.

Rentlar, (edited )

I understand, and raising the issue for discussion is fine. With all due respect to OP, I take it personally when the discussion is framed with the implication that the developers should not have released a project with some bugs, or that they should have put more effort here or there. I’ve contributed to Lemmy in coding, translation, and small donations, but I’m not here for people to push blame onto devs. This is why bringing up the question “Why hasn’t anything been done?”, while I recognize it is a question on some people’s minds, gets on my nerves. It bothers me like a clickbait/ragebait title does for many.

I would rather the discussion focus on where efforts are made or will be made to mitigate and fix the problem.

Coki91,
@Coki91@dormi.zone avatar

Entitlement? The “subtitles” are acting as questions the reader may have, with the answer below each one. OP is not demanding anything.

Rentlar,

Fair point. That question itself is what bothers me, even if it is a valid one people have on their minds. The answer to that question should highlight more clearly what has been done, and if OP doesn’t know, then IMO it would be best not to include that question/answer.

I have no problem with OP’s post and the act of bringing up this issue and discussing it. Including that question with an incomplete answer bothers me like a clickbait headline for an article does, or how Tucker Carlson’s show asks questions. It serves little purpose but to put the people working on fixes in a bad light, acting like they haven’t been working on anything.

bmygsbvur,

Entitled attitude? I’m just bringing it up again. It was brought up some time ago but wasn’t given attention so I’m bringing it up again after the recent CSAM attacks.

I didn’t demand anything in the post. I brought up the issue, explained why it’s important, and what admins could do about it.

I don’t know how to code but that doesn’t mean I’m not allowed to bring this issue to light…

Rentlar, (edited )

I have no issue with your post itself, and discussing this issue is important; it’s good to highlight things like this. Thank you for bringing it up, and sorry if I sound mad at you for doing that.

I will point out, the specific thing that bothers me is that the heading

How come this hasn’t been addressed before?

contains an incomplete answer that ignores work by the devs that is currently in progress to address it. I don’t blame you for not knowing the answer, but for including and answering that question when you don’t know the answer. To me it’s reminiscent of Tucker Carlson-style questioning, where some issue is brought up, questions are asked, but the answer is sparsely researched and the viewer is expected to come to some conclusion about who to blame. This specifically is what gets on my nerves.

If you can include where work to rectify the issue has been discussed and is in progress - GitHub issues, discussion throughout Lemmy, and other things - I’ll edit my first reply to note my concern is assuaged.

E: Here are some of the relevant issues and discussion:

bmygsbvur,

I don’t care if you don’t like my English writing. I brought up the issue, and if people don’t care about it then whatever. We’ll just have to wait until it’s abused; then maybe people will actually be concerned.

bermuda,

I’m glad to live in a world where concern about safety is considered entitlement somehow

Aux,

Man, Lemmy devs have zero clue about best practices… What a crap show!

Ajen,

These kinds of issues are common on any large software project.

Aux,

Mmm, not really.

Ajen,

Are you speaking from experience?

Zeth0s,

Did you open an issue on GitHub?

You are wording this like a clickbait news article.

You find an issue, you report it to the right channel, you notify people. Good. This is how software development works, with an active community reporting issues.

But why use such a tone?

bmygsbvur,

I’m not on GitHub. Nor are a lot of people here. I’m wording it this way so the issue gets the attention it deserves. Anyway, everybody already knows about this, but nobody understood the consequences. Same reason why there’s no option to disable image caching. These issues should have been addressed the moment image uploading was made available in Lemmy. It was just overlooked because of how tiny the platform was then.

It’s funny because last month Mastodon CSAM was a hot topic in the Fediverse and people were being defensive about it. Look where we are now. Has Mastodon addressed the CSAM issue? Did they follow the recommendations made by that paper? I don’t think so; otherwise there wouldn’t be an open GitHub issue about it. Will Lemmy be like Mastodon, or will it address the concerns of its users?

Zeth0s,

Creating a product of any size is about planning.

If you notify people here, your information will be lost in two days. People forget and move on to the next hot topic. Relevant stakeholders might very well completely miss this post, because they are not on Lemmy 24/7.

The way to make it more relevant is to go to the place where the planning is done, i.e. GitHub for Lemmy. Open an issue there, explain the problem, and describe possible solutions. Come back to Lemmy, link the issue, and ask people to react to it (i.e. show that it is relevant to them).

This is the best way to obtain what you ask. Social media platforms are too broad and fuzzy for tracking real issues.

This is also why you see a lot of work being done on the SQL performance of the Lemmy backend: most past issues on GitHub concerned that.

This is my suggestion. If you really care about this being implemented, open a ticket on GitHub and follow the discussion there. If you see there is not enough traction, ask fellow lemmings for help.

Suggestions for the github issue are:

  • be very specific
  • be polite
  • suggest solutions

If your solution is good, great; if not, people are more willing to think about a problem just to show a stranger on the internet that they’re wrong.

bmygsbvur,

Feel free to open the issue on my behalf. I am not a software developer. You seem to know more about this. I’m just reminding people of something that I and many others observed months ago.

JackbyDev,

Signing up for GitHub and opening this issue would take about as long as making this post.

meat_popsicle,

You could’ve just done it yourself if you felt so passionate about it. Badgering people into action rarely works.

JackbyDev,

I’m not badgering, I’m demystifying the process.

Zeth0s, (edited )

I haven’t experienced the issue myself. I trust your experience, but I cannot completely reproduce or describe it, as I am not self-hosting. I couldn’t answer any questions from developers regarding it.

Best would be for you to report this. You can create an issue here:

github.com/LemmyNet/lemmy/issues

There is a simple template to fill out, and you can copy and paste most of the text from this thread.

bmygsbvur,

You don’t need to selfhost to reproduce this. Anyone can do this and that’s the problem.

Ajen,

Not sure why you’re getting downvoted, since you gave clear instructions that anyone can follow to verify what you said.

bmygsbvur,

Sadly, not everyone bothered to read the post; many just jumped to the comments. Again, it’s like the Mastodon CSAM issue last month. People didn’t read the paper and acted so defensively about it. Now that Lemmy is experiencing the same problems, people suddenly act differently? Crazy.

worsedoughnut,
@worsedoughnut@lemdro.id avatar

so the issue gets the attention it deserves.

The other best way to do this is to actually submit the issue in the appropriate location so the Lemmy devs can track and respond to it.

It’s been 7 hours; it can’t be that hard to make a GitHub account and format this post into an actually helpful GitHub issue.

bmygsbvur,

Because there’s already an issue dated July 6: github.com/LemmyNet/lemmy/issues/3504

Like I said, people already know about this months ago.

AphoticDev,

How would they address your concerns? The chances that one of the devs follows you is nonexistent, I would wager. Instead of using the proper channels to inform them, you did the exact opposite and posted it someplace they are almost guaranteed not to see it.

bmygsbvur,

It’s on the GitHub issue tracker already. Did you not read the post?

lukas,

Are you aware of the consequences of your actions? You didn’t inform the people who can fix this issue of the potential impact, no. You informed the Lemmy community that they can upload whatever they want, and some of them are pedophiles. Not cool at all. Responsible disclosure ain’t a thing outside of cybersecurity I suppose, though irresponsible disclosure is prevalent everywhere. Very irresponsible.

bmygsbvur,

Rogues are very keen in their profession, and know already much more than we can teach them.

sunaurus,

FYI to all admins: with the next release of pict-rs, it should be much easier to detect orphaned images, as the pict-rs database will be moved to postgresql. I am planning to build a hashtable of “in-use” images by iterating through all posts and comments by lemm.ee users (+ avatars and banners of course), and then I will iterate through all images in the pict-rs database, and if they are not in the “in-use” hash table, I will purge them.

Of course, Lemmy can be improved to handle this case better as well!
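
As a rough illustration of that plan, a minimal orphan-scan sketch: the table and column names are assumptions based on the usual Lemmy postgres schema (post.url/thumbnail_url, comment.content, person.avatar/banner, community.icon/banner), and aliases.txt is assumed to be a list of stored pict-rs filenames exported separately. It only reports orphans; purging is left to the admin.

```python
# Orphan-scan sketch of the approach described above.
# ASSUMPTIONS: the usual Lemmy postgres schema (post.url/thumbnail_url,
# comment.content, person.avatar/banner, community.icon/banner -- check
# against your Lemmy version), and an "aliases.txt" file containing one
# pict-rs alias (stored filename) per line, exported however you prefer.
import re

import psycopg2

PICTRS_RE = re.compile(r"/pictrs/image/([A-Za-z0-9-]+\.\w+)")

conn = psycopg2.connect("dbname=lemmy user=lemmy")
in_use: set[str] = set()

QUERIES = [
    "SELECT url, thumbnail_url FROM post",
    "SELECT content FROM comment",
    "SELECT avatar, banner FROM person",
    "SELECT icon, banner FROM community",
]
with conn.cursor() as cur:
    for query in QUERIES:
        cur.execute(query)
        for row in cur:
            for field in row:
                if field:
                    in_use.update(PICTRS_RE.findall(str(field)))

# Anything stored in pict-rs but never referenced anywhere is an orphan.
with open("aliases.txt") as f:
    for line in f:
        alias = line.strip()
        if alias and alias not in in_use:
            print("orphan:", alias)
```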

Nerd02,
@Nerd02@lemmy.basedcount.com avatar

I’m an instance administrator, what the fuck do I do?

There’s one more option. The awesome @db0 has made this tool to detect and automatically remove CSAM content from a pict-rs object storage.

github.com/db0/lemmy-safety

Xylight,

You need a GPU for that. Most $5 VPSs don’t have that.

Nerd02,
@Nerd02@lemmy.basedcount.com avatar

Yeah, I know. It’s supposed to be run from your computer, not the VPS.

Xylight,

Would I mount the pictrs folders as a network folder locally?

Nerd02,
@Nerd02@lemmy.basedcount.com avatar

No. Unfortunately, it only works with storage on object stores like S3 buckets, not with filesystem storage. Meaning it accesses the files remotely, one at a time, from the bucket, downloading them over the internet (I assume; I didn’t make this).

But the more important thing is that, as it states in the readme, no files get saved to your disk; they only stay in your RAM while they are being processed, and everything is deleted right after. This is relevant because even having had CSAM on your disk at some point can put you in trouble in some countries; with this tool, that never happens.

Which, btw, is the same reason why mounting the pict-rs folder to your local computer is probably not a good idea.

db0,
@db0@lemmy.dbzer0.com avatar

Theoretically, this tool could be adjusted to go via scp and read your filesystem pict-rs storage as well; someone just has to code it.

Nerd02,
@Nerd02@lemmy.basedcount.com avatar

Interesting. That would be a nice extension, I think most small admins are using the filesystem (I know I am lol).

bmygsbvur,

This is a nice tool, but orphaned images still need to be purged. I mentioned in the other thread that bad actors can upload spam to fill up object storage space.

Nerd02,
@Nerd02@lemmy.basedcount.com avatar

That is also very true. I think better tooling for that might come with the next pict-rs version, which will move the metadata to a database (right now it’s in an internal key-value store). Hopefully that will make it easier to identify orphaned images.

newhoa,

A lot of web software does this (GitHub and Gmail, for example). I like it but always thought it could be abused.

Send_me_nude_girls,

You mean Gmail drafts? I know of at least one case where criminals used this: they shared a Gmail account password and messaged each other only via the drafts function, so technically no mail was ever sent.

bmygsbvur,

They probably have the tools to deal with it. Lemmy certainly doesn’t.

WtfEvenIsExistence,

Oh wow. I always assumed the images are deleted if you don’t submit the post.

😬

bmygsbvur,

Sadly not the case

sabreW4K3,
@sabreW4K3@lemmy.tf avatar

Perhaps someone should create a script to purge orphan images

Danc4498,

Seems like the logical fix

Matriks404,

The logical fix would be to delete them automatically when unused for longer than, say, 24 hours. That should be in the Lemmy code; we should not depend on third-party utilities for it.

Danc4498,

Well, yeah. We’re saying the same thing. A script to fix this running outside Lemmy is a quick fix. But this same process should be built into Lemmy itself.

sabreW4K3,
@sabreW4K3@lemmy.tf avatar

You can submit a patch upstream

gumball4933,

Not everyone is a developer. Users are allowed to point out issues without working on a fix themselves.

lippiece,

This attitude works on sites that provide you a service. They have terms of service and have to comply. In open source, no one owes you anything.

See a problem? Either fix it or tell someone who can, or leave.

Draconic_NEO,
@Draconic_NEO@lemmy.world avatar

or tell someone who can

That’s literally exactly what they meant by:

“Users are allowed to point out issues without working on a fix themselves.”

lippiece,

Other Lemmy users are not “someone who can”.

bmygsbvur,

Very much needed.

gravitas_deficiency,

Or just tighten up the API such that uploaded pictures have a relatively short TTL unless they become attached to a post or otherwise linked somewhere.

A script is a fine stopgap measure, but we should try to treat the cause wherever possible, instead of simply addressing the symptom.
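
A sketch of what treating the cause could look like, against a hypothetical image_upload(alias, published, attached) bookkeeping table that Lemmy would write to at upload time; no such table exists today.

```python
# TTL purge sketch against a HYPOTHETICAL image_upload(alias, published,
# attached) table -- Lemmy has no such table today; it is the bookkeeping
# that a TTL-based fix would add at upload time.
import psycopg2

TTL_HOURS = 24  # uploads older than this and still unattached expire

conn = psycopg2.connect("dbname=lemmy user=lemmy")
with conn.cursor() as cur:
    cur.execute(
        """
        SELECT alias FROM image_upload
        WHERE NOT attached
          AND published < now() - make_interval(hours => %s)
        """,
        (TTL_HOURS,),
    )
    for (alias,) in cur:
        # Hand each expired alias to whatever pict-rs deletion hook you use.
        print("expired, safe to purge:", alias)
```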

chaorace, (edited )
@chaorace@lemmy.sdf.org avatar

What’s the practical difference? In both cases you’re culling images based on whether they’re orphaned or not.

If you’re suggesting that the implementation be based on setting individual timers instead of simply validating the whole database at regular intervals, consider whether or not the complexity of such a system is actually worth the tradeoff.

“Complexity comshmexity”, you might say. “Surely it’s not a big deal!”. Well… what about an image that used to belong to a valid post that later got deleted? Guess you have to take that edge case into account and add a deletion trigger there as well! But what if there were other comments/posts on the same instance hotlinking the same image? Guess you have to scan the whole DB every time before running the deletion trigger to be safe! Wait… wasn’t the whole purpose of setting this up with individual jobs to avoid doing a scripted DB scan?

gravitas_deficiency,

There are mechanisms that exist in a LOT of services for handling TTL expiry and any relevant purging that needs to be done.

That said, a cursory look at the pict-rs project suggests it doesn’t have any provision for TTL, so it’s probably going to have to be done as a cron job anyway - or at least triggered by the Lemmy service when an image upload isn’t used in an instance-local Lemmy post within some reasonable interval.

Note that I’m specifically including “in an instance-local post” because I am assuming admins don’t want to provide free cloud image hosting to random internet people for arbitrary non-Lemmy use.

chaorace,
@chaorace@lemmy.sdf.org avatar

Note that I’m specifically including “in an instance-local post” because I am assuming admins don’t want to provide free cloud image hosting to random internet people for arbitrary non-Lemmy use.

Note that I at no point allude to hotlinking from outside of the instance. Unless you want it to be possible to create an image post, delete the post, and then have an orphaned image forever (thereby creating an attack vector), you do need to solve that problem. If you solve that problem without considering crossposts and comment hotlinks within the scope of your own instance, you’re going to cause breakage. If you’re forced to consider these things before triggering the deletion regardless, then you’re not saving much on performance.

PuppyOSAndCoffee,

Wouldn’t it be just as easy to whitelist DNS?

cwagner,

deleted_by_author

PuppyOSAndCoffee,

An option to prevent users from uploading unless their DNS has been whitelisted. It would require explicit permission to upload, which could be handy for smaller instances.

cwagner, (edited )

deleted_by_author

PuppyOSAndCoffee,

Not IP. A DNS whitelist. This way, if a geography or subnet is responsible for illegal material, they are only allowed in if an instance has granted +w.

cwagner,

deleted_by_author

PuppyOSAndCoffee,

Every person on the internet has a DNS record that loops back to them. DNS has a topography, so various elements of a domain could be whitelisted, or not.

It would be trivial to queue a whitelist request, where an administrator could decide if it is worth it, having it auto-expire over time.

Instance admins could share sources of bad actors.

Heuristics could help determine the risk of an approval action.

cwagner,

deleted_by_author

PuppyOSAndCoffee,

That’s how the internet works? Every device on the internet has an IP address, and most IP devices are assigned a unique DNS name for that address.

That DNS has a topography (a.b.c.com) so that you could whitelist (*.b.c.com). Mobile devices, home networks, college and corporate campuses… all are probably going to have a DNS name associated with them. Entire swathes of the internet could be whitelisted fairly easily… or not.

Yes, it would not be foolproof. However, IP whitelisting would rapidly lose its meaning.
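
For what the check itself might look like, here is a small standard-library sketch; the whitelist entries are hypothetical examples.

```python
# Sketch of the wildcard reverse-DNS check being described, using only the
# standard library. The whitelist entries are hypothetical examples.
import socket

WHITELIST = ("*.mobile.att.net", "*.res.example-isp.com")

def rdns_allowed(ip: str) -> bool:
    """Reverse-resolve the client IP and match it against wildcard suffixes."""
    try:
        hostname, _aliases, _addrs = socket.gethostbyaddr(ip)
    except socket.herror:
        return False  # no PTR record at all -> reject
    return any(
        hostname.endswith(pattern.lstrip("*"))  # "*.b.c.com" -> ".b.c.com"
        for pattern in WHITELIST
    )

print(rdns_allowed("8.8.8.8"))  # dns.google -> False for this whitelist
```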

cwagner,

deleted_by_author

PuppyOSAndCoffee,

FWIW, the wide-area / global internet would fall apart without DNS.

Local networking ignores DNS for the most part. Your home router, for example, doesn’t need DNS to route packets from your ISP to your device, or IP for that matter; Ethernet alone is enough.

DNS itself is unrelated to geography.

Geography at best is an ISP convention.

Most geographic maps utilize physical routing to guess a DNS location. At best, a DNS relationship to geography is a finger in the air; many times it’s just wrong.

If someone has no DNS entry, it is more likely they are up to no good when accessing Lemmy. It is not foolproof; however, a smaller instance would be reasonably protected.

They, I imagine, care less about who the user is (username) and more about the user’s internet gateway (DNS). Only the instances that woke up to GBs of child pornography know how they got there.

There is a CPU hit for this sort of thing, so it is not free.

bmygsbvur,

Explain.

Ricaz,

He probably means whitelisting domains when posting already uploaded images, clearly not having read the post.

PuppyOSAndCoffee,

No, I mean the user’s DNS should be whitelisted to permit uploads. If the DNS is not on the whitelist, then no upload, period.

Ricaz,

What do you mean by “the user’s DNS” exactly?

bmygsbvur,

That’s another issue. Also a necessary feature.

Krapulaolut,

Yes it would, if the problem had anything to do with DNS.

PuppyOSAndCoffee,

The problem is people, and people have DNS.

shagie,

Try turning wifi off on your phone, getting the IP address, and then looking up the DNS entry for that, and consider whether you want to whitelist it. Then do this again tomorrow and check to see if it has a different value.

Once you get to the point of “whitelist everything in *.mobile.att.net” it becomes pointless to maintain that as a whitelist.

Likewise, *.dhcp.dorm.college.edu is not useful to whitelist.

PuppyOSAndCoffee,

Yes. I am well aware, and that would be by design.

Remember: if someone on a major mobile network is uploading child pornography, that device is radioactive, and an instance admin is going to have options they may not have in other situations.

The idea is to give instance admins control over who uploads content. Perhaps they don’t want mobile users to upload content, or perhaps they do, but only on major carriers, by their own definition of major.

Somewhere between “everyone” and “nobody” is an answer.

Giving the instance administrator tools to help quarantine bad actors only helps, though it will require layers. Reverse DNS has a cost; however, perhaps the tax is worth it when hosting images, where there is already a pause point in the end-user experience and the ramifications are so severe.

Larger instances may dilligaf, but a smaller instance may need to be very careful…

Just sayin…

shagie,

Do you have a good and reasonable reverse DNS entry for the device you’re writing this from?

FWIW, my home network comes NATed out as {ip-addr}.res.provider.com.

Under your approach, I wouldn’t have any system that I’d be able to upload a photo from.

PuppyOSAndCoffee,

Why do you say that, knowing full well DNS whitelists rely on wildcards?

shagie,

If you’re whitelisting *.res.provider.com and *.mobile.att.com, the whitelist is rather meaningless because you’ve whitelisted almost everything.

If you are not going to whitelist those, do you have any systems available to you (because I don’t) that would pass a theoretical whitelist that you set up?

PuppyOSAndCoffee,

Why does it matter? Read some of my other posts.

shagie,

Would you be able to post an image if neither *.res.provider.com nor *.mobile.att.com were whitelisted, and putting 10-11-23-45.res.provider.com (and whatever it will be tomorrow) in the whitelist each time your address changed was considered too onerous?

PuppyOSAndCoffee,

Why wouldn’t you whitelist *.mobile.att.net?

shagie,

If you have whitelisted *.mobile.att.net, you’ve whitelisted a significant portion of the mobile devices in the US with no ability to say “someone in Chicago is posting problematic content”.

You’ve whitelisted 4.6 million IPv4 addresses and 7.35 x 10^28^ IPv6 addresses.

Why have a whitelist at all, then?

PuppyOSAndCoffee,

We should assume most users are not going to be criminals.

For users who are criminals, instance administrators are on the hook to help the feds catch the bad guys. If the bad guys are using mobile devices, it’s a slam dunk, right?

100%: in a democratically elected country governed by the rule of law, an instance admin should be handing over logs to their regional authorities to pursue child pornographers to the full extent possible.

One assumes most child pornographers know this, so they are going to use other methods to mask their IP. Those other methods are going to be filtered out by a DNS whitelist - even if a user utilized a mobile device 99% of the time, the minute they try to anonymize their criminal content, it’s more likely to be gated right at the source.
