ancoraunamoka

@ancoraunamoka@lemmy.dbzer0.com


ancoraunamoka,

I started because I wanted to get around censorship in my country. I also wanted to view stuff in the original language and here we dub everything.

ancoraunamoka,

Can someone explain to me how this movie stands in the whole Godzilla franchise?

Suppose I am a guy who only watched the original Godzilla movies from '85.

ancoraunamoka,

The only good reply in the thread. Thanks for saying this

ancoraunamoka,

Are those your own blurays? Then share them before compressing.

Transcoding is hard. There is no way that your transcoding settings are going to be a one size fits all. I am currently encoding the famous iKaos Dragonball release and I did 48 samples before deciding what configuration to use.

You are better off downloading stuff from torrents, especially for newer media. You’ll find a community that has collectively put 100x your time into transcoding. That will also save you from tremendous electricity costs.

Also look into VMAF for quality metrics. Consider that switching to untouched 1080p might bring you close to your goal with very very low effort.
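
For reference, VMAF can be computed with ffmpeg’s libvmaf filter, assuming your ffmpeg build includes it (the file names here are placeholders):

```shell
# Compare a transcoded sample against the untouched source.
# libvmaf takes the distorted stream as the first input and the
# reference as the second; scores range 0-100, higher is closer.
# Requires an ffmpeg build with --enable-libvmaf.
ffmpeg -i sample.mkv -i source.mkv \
  -lavfi "[0:v][1:v]libvmaf=log_fmt=json:log_path=vmaf.json" \
  -f null -
```

Run it on a few short samples cut from different scenes rather than the whole movie; that is how you compare candidate encoder settings quickly.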

Btw, can you share the title list?

ancoraunamoka,

yeah, I would redownload all of those instead of transcoding. They are all available with very good encodes publicly

How should I do backups?

I have a server running Debian with 24 TB of storage. I would ideally like to back up all of it, though much of it is torrents, so only the ones with low seeders really need to be backed up. I know about the 321 rule but it sounds like it would be expensive. What do you do for backups? Also if anyone uses tape drives for backups I am...

ancoraunamoka,

I am a simple man, so I use rsync.

Set up a mergerfs drive pool of about 60 TiB and rsync weekly.

Rsync seems daunting at first but then you realize how powerful and most importantly reliable it is.

It’s important that you try to restore your backups from time to time.

The main reasons why I avoid software such as Kopia or Borg or Restic or whatever is in fashion:

  • they go unmaintained
  • they are not simple: many of my friends have struggled to restore backups because you are no longer dealing with files, but with encrypted or compressed blobs
  • rsync has an easy mental model and extremely good defaults
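
A minimal sketch of that weekly job (the paths are hypothetical; adapt them to your pool):

```shell
#!/bin/sh
# backup: mirror a source tree onto the mergerfs backup pool.
# -a preserves permissions, owners, and timestamps; --delete removes
# files from the backup that no longer exist in the source, so the
# destination stays an exact copy.
backup() {
    src="$1"
    dst="$2"
    rsync -a --delete "$src/" "$dst/"
}
```

Call it from a weekly cron entry, e.g. `backup /srv/data /mnt/backup-pool/data`. Because the destination is just plain files, testing a restore is nothing more than copying a file back.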

ancoraunamoka,

What other people are saying is that you can rsync onto an encrypted file system or other types of storage. What are your backup targets? In my case I own the disks, so I use LUKS partition -> ext4 -> mergerfs to end up with a single volume I can mount on a folder.
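
A sketch of that stack for two data disks (device names and mount points are assumptions; all of this needs root):

```shell
# Unlock each LUKS-encrypted partition
# (creates /dev/mapper/disk1 and /dev/mapper/disk2)
cryptsetup open /dev/sda1 disk1
cryptsetup open /dev/sdb1 disk2

# Mount the ext4 filesystems that live inside the LUKS containers
mount /dev/mapper/disk1 /mnt/disk1
mount /dev/mapper/disk2 /mnt/disk2

# Pool both mounts into a single mergerfs volume
mergerfs /mnt/disk1:/mnt/disk2 /mnt/pool -o defaults,allow_other
```

After that, `/mnt/pool` is the single folder the rsync job writes into, while the data at rest on each disk stays encrypted.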

ancoraunamoka,

How does this make rsync look safer? To me the risk seems similar, but I might not know the development background of these projects.

Rsync is available out of the box in most Linux distros and is widely used not only for backups but for a lot of other things, such as repository updates and transfers from file hosts. This means a lot more people are interested in it. Also, looking at the source code, the implementation is cleaner and easier to understand.

how do you deal with it when just a file changes?

I think you should consider that not all files are equal. Rsync for me is great because I end up with a bunch of disks that contain an exact copy of the files I have on my own server. Those files don’t change frequently, they are movies, pictures, songs and so on.

Other files such as code, configuration, files on my smartphone, etc… are backed up differently. I use git for most stuff that fits its model, and syncthing for my temporary folders and my mobile phone.

Not every file can suit the same backup model. I trust that files that get corrupted or lost are in my weekly rsync backup. A configuration file I messed up two minutes ago is on git.

ancoraunamoka,

As long as you understand that simply syncing files does not protect against accidental or malicious data loss like incremental backups do.

Can you show me a scenario? I don’t understand how incremental backups cover malicious data loss cases

ancoraunamoka,

Going unmaintained is a non-issue, since you can still restore from your backup. It is not like a subscription or proprietary software, which is no longer usable when you stop paying for it or the company owning it goes down.

Until they hit a hard bug or don’t support newer transport formats or scenarios. Also the community dries up eventually

ancoraunamoka,

It is unrealistic that in a stable software release there is suddenly, after you tested your backup, a hard bug which prevents recovery.

How is it unrealistic? Think of this:

  • day 1: you back up your files, test the backup, and everything is fine
  • day 2: you store a new file that triggers a bug in the compression/encryption algorithm of whatever software you use; now backups are corrupted, at least for this file

Unless you test every backup you do, and consequently can’t back up fast enough, I don’t see how you can predict that future files and situations won’t trigger bugs in a software.

ancoraunamoka,

Fellow Italian pirate here, using Gentoo for servers and laptop since 2014. Very interesting, thank you for sharing. Would love to have a chat someday.

I just deleted my entire library and redownloaded it.

A lot of my files were shitty 480p versions of movies from the Napster days. Now they’re all 1080p, with a few 720p exceptions (mainly tv series episodes). All in all 500 something files in total. Now just watching uTorrent slowly download them all. Hopefully my VPN keeps the eyes off of me…

ancoraunamoka,

Your question is so generic that it is difficult to reply. I’ll tell you about my use case then so that you can try to figure out yours.

My goal is to be a respectful citizen. I divide my torrents in three categories:

  • rare stuff: for example Project 4K77, the John Wick regrades, or Rashomon
  • Italian stuff: it can be popular or rare; Italian content is not seeded much, so I need to do my part
  • common/popular stuff: for example the Barbie movie or any Marvel stuff

I bought tons of space (recently converted to three drives, 20 TB each) and use a virtual machine locked behind a VPN. Even if I forget to pay, the virtual machine is bound to the tunnel so that traffic doesn’t go out except to the LAN, so no leaks.

The VM has two torrent clients:

  • qbittorrent: seeds the torrents in the common/popular category, speed capped to 1/3 of my bandwidth
  • transmission (previously rtorrent): for the other two categories

I tend to leave everything in transmission seeding forever; the stuff in qbittorrent seeds until a 2.5 or 4.0 ratio, depending on my mood.

At the moment I have a 90.2 ratio on transmission and many many many TB of uploaded stuff. That should be enough to feel like you are giving back.
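
A kill-switch of that kind can be sketched with an nftables ruleset inside the VM; the interface names (tun0, the LAN subnet) and the VPN port are assumptions about the setup:

```shell
# /etc/nftables.conf (sketch): drop any outgoing traffic that is not
# the VPN tunnel (tun0) or the local LAN. If the tunnel goes down,
# torrent traffic simply stops instead of leaking out the plain NIC.
table inet killswitch {
    chain output {
        type filter hook output priority 0; policy drop;
        oifname "lo" accept                # loopback
        oifname "tun0" accept              # everything through the VPN
        ip daddr 192.168.1.0/24 accept     # LAN only
        udp dport 1194 accept              # let the VPN client itself connect
    }
}
```

With the policy set to drop, nothing needs to detect the tunnel failing; traffic is only ever allowed through it in the first place.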

ancoraunamoka,

qbit manage

that is interesting advice. Regarding containers, they don’t fit my use case.

I’ve just released Gatekeeper 1.6.0. It’s a single executable that turns any Linux machine into a home gateway. Now with realtime traffic graphs, LAN autoconfiguration, full cone NAT and better looks. (github.com)

Hi all home network administrators :) Haven’t posted anything here since June, when I told you about Gatekeeper 1.1.0. Back then it was a pretty bare-bones (and maybe slightly buggy) DNS + DHCP server with a web UI with a list of LAN clients. Back at 1.1.0 Gatekeeper didn’t even configure your LAN interface or set up NAT...

ancoraunamoka,

Very interesting project, thanks for sharing and working on this. I am actually one of your target users, with enough knowledge to implement my own router, at the moment running on Gentoo.

I would like to use this, but it lacks port forwarding and a firewall, which are a must. I’ll try it out nevertheless. I’m quite impressed by the stylish HTML graphics, and I appreciate your departure from the typical “modern” gray corporate Bootstrap UI design. It’s really, really cool.

One question: how do you envision exposing this service to the internet? I quite despise Rust, but I wonder if the use of a memory-safe language would help with the inevitable bugs, especially if you put even more features into Gatekeeper.

ancoraunamoka,

thank you for the reply. All the stuff you wrote makes sense.

But even if I obtain a LetsEncrypt cert, any LAN device can do the same thing, so the whole TLS can still be MITM-ed.

can you elaborate?

ancoraunamoka,

Interesting. This could all be solved if Gatekeeper didn’t allow port redirection on 80 unless explicitly configured by the administrator, right?

ancoraunamoka,

you are literally just posting buzzwords. You can be lean with MySQL, and you can write bloated programs in Rust. I would argue most Rust web services are shittier than Java ones.

ancoraunamoka,

Fantastic Four Batgirl (I can dream 😂)

can you explain more?

ancoraunamoka,

My point: if you’re getting started selfhosting you have to embrace and accept the self-inflicted punishment. Good luck everybody, I don’t know if I can keep choosing to get disappointed.

I would say that your self-inflicted punishment is using Windows. Switch to Debian and thank me in six months.
