Inspired by a post since deleted, I feel bad for probably coming off as judgmental about the poster’s taste in the movie that drove him to consider sailing....
Interesting problem here. So I self-host Jellyfin, happy to share my (owned) movies with my family. Well, my mother has asked me to digitize her collection too and have me host it. Originally, fine: you give your movies to me, I host them, same thing....
I have a collection of about 110 4K Blu-ray movies that I’ve ripped, and I want to take the time to compress and store them for use on a future Jellyfin server....
Are those your own Blu-rays? Then share them before compressing.
Transcoding is hard. There is no way your transcoding settings are going to be one-size-fits-all. I am currently encoding the famous iKaos Dragonball release, and I did 48 samples before deciding which configuration to use.
You are better off downloading from torrents, especially for newer media. You’ll find a community that has collectively put 100x your time into transcoding. That will also save you from tremendous electricity costs.
Also look into VMAF for quality metrics. Consider that switching to untouched (remuxed) 1080p might bring you close to your goal with very, very little effort.
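For reference, a VMAF score can be computed with ffmpeg’s libvmaf filter, if your build includes it. This is just a sketch: the file names are placeholders, and you’ll want to run it on short samples first.

```shell
# Compare a transcode against its source with VMAF.
# Requires an ffmpeg build with --enable-libvmaf;
# transcode.mkv and source.mkv are placeholder file names.
ffmpeg -i transcode.mkv -i source.mkv \
  -lavfi libvmaf -f null -
```

The distorted file goes first and the reference second; the mean VMAF score is printed at the end of the run. Scores in the mid-90s are often treated as near-transparent, but judge against your own eyes and samples.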
I have a server running Debian with 24 TB of storage. I would ideally like to back up all of it, though much of it is torrents, so only the ones with few seeders really need to be backed up. I know about the 3-2-1 rule, but it sounds like it would be expensive. What do you do for backups? Also, if anyone uses tape drives for backups I am...
Set up a mergerfs drive pool of about 60 TiB and rsync weekly.
Rsync seems daunting at first, but then you realize how powerful and, most importantly, reliable it is.
It’s important that you try to restore your backups from time to time.
One of the main reasons why I avoid software such as Kopia or Borg or Restic or whatever is in fashion:
- they go unmaintained
- they are not simple: so many of my friends struggled restoring backups because you are not dealing with files anymore, but with encrypted or compressed blobs
rsync, by contrast, has an easy mental model and extremely good defaults.
What other people are saying is that you rsync onto an encrypted file system or another type of storage. What are your backup targets? In my case I own the disks, so I use LUKS partition -> ext4 -> mergerfs to end up with a single volume I can mount on a folder.
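In practice, that chain might look something like this. The device names, mapper names, and mount points are all made up for illustration; adapt them to your own disks.

```shell
# Unlock each disk, mount its ext4 filesystem, then pool them.
# /dev/sda1, /dev/sdb1 and the mount points are example values.
cryptsetup open /dev/sda1 disk1
mount /dev/mapper/disk1 /mnt/disk1
cryptsetup open /dev/sdb1 disk2
mount /dev/mapper/disk2 /mnt/disk2

# mergerfs presents both disks as a single volume on /mnt/pool.
mergerfs /mnt/disk1:/mnt/disk2 /mnt/pool \
  -o defaults,allow_other,category.create=mfs
```

`category.create=mfs` places new files on the branch with the most free space; pick whichever create policy fits your pool.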
How does this make rsync look safer? To me the risk seems similar, but I might not know the development background of these tools.
Rsync is available out of the box in most Linux distros and is widely used not only for backups but for a lot of other things, such as repository updates and transfers from file hosts. This means a lot more people are interested in it. Also, looking at the source code, the implementation is cleaner and easier to understand.
How do you deal with it when just one file changes?
I think you should consider that not all files are equal. Rsync for me is great because I end up with a bunch of disks that contain an exact copy of the files on my server. Those files don’t change frequently; they are movies, pictures, songs, and so on.
Other files, such as code, configuration, and files on my smartphone, are backed up differently. I use git for most stuff that fits its model, and Syncthing for my temporary folders and my mobile phone.
Not every file suits the same backup model. I trust that files that get corrupted or lost are in my weekly rsync backup. A configuration file I messed up two minutes ago is in git.
Going unmaintained is a non-issue, since you can still restore from your backup. It is not like a subscription or proprietary software, which is no longer usable when you stop paying for it or the company that owns it goes down.
Until they hit a hard bug, or don’t support newer transport formats or scenarios. Also, the community dries up eventually.
It is unrealistic that in a stable software release there is suddenly, after you have tested your backup, a hard bug which prevents recovery.
How is it unrealistic? Think of this:
- day 1: you back up your files, test the backup, and everything is fine
- day 2: you store a new file that triggers a bug in the compression/encryption algorithm of whatever software you use; now backups are corrupted, at least for this file
Unless you test every backup you make (and consequently can’t back up fast enough), I don’t see how you can predict that future files and situations won’t trigger bugs in a software.
The last time I tried emulation on a desktop PC, whether it was Windows or Linux, I had to install each emulator separately. It was a bit of a mess....
Well, I decided to brush up my simple HTML page and created a fully linked wiki on the subject. Please take a look, in the hope it will be useful for at least one fellow one-eyed, peg-legged, passionate data hoarder....
A lot of my files were shitty 480p versions of movies from the Napster days. Now they’re all 1080p, with a few 720p exceptions (mainly TV series episodes). All in all, 500-something files. Now I’m just watching uTorrent slowly download them all. Hopefully my VPN keeps the eyes off of me…
Your question is so generic that it is difficult to reply. I’ll tell you about my use case then so that you can try to figure out yours.
My goal is to be a respectful citizen. I divide my torrents in three categories:
- rare stuff: for example Project 4K77, the John Wick regrades, or Rashomon
- Italian stuff: it can be either popular or rare; Italian content is not seeded much, so I need to do my part
- common/popular stuff: for example the Barbie movie or every Marvel thing
I bought tons of space (recently consolidated to three drives, 20 TB each) and use a virtual machine locked behind a VPN. Even if I forget to pay, the virtual machine is bound to the tunnel so that traffic doesn’t go out except to the LAN, so no leaks.
The VM has two torrent clients:
- qBittorrent: seeds the torrents in the common/popular category, speed capped to 1/3 of my bandwidth
- Transmission (previously rtorrent): for the other two categories
I tend to leave everything in Transmission seeding forever, and the stuff in qBittorrent seeding until a 2.5 or 4.0 ratio, depending on my mood.
At the moment I have a 90.2 ratio on Transmission and many, many, many TB of uploaded stuff. That should be enough to feel like you are giving back.
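The “bound to the tunnel” part can be done with a default-drop firewall inside the VM. A minimal nftables sketch, assuming `wg0` is the VPN interface, `192.168.1.0/24` is the LAN, and `203.0.113.10` stands in for the VPN server’s endpoint (all of these are assumptions to adjust):

```shell
# Kill switch: drop all egress except loopback, the tunnel, and the LAN.
# wg0, the LAN range, and the endpoint IP are example values.
nft add table inet killswitch
nft add chain inet killswitch out \
  '{ type filter hook output priority 0 ; policy drop ; }'
nft add rule inet killswitch out oifname "lo" accept
nft add rule inet killswitch out oifname "wg0" accept
nft add rule inet killswitch out ip daddr 192.168.1.0/24 accept
# Allow the VPN server's endpoint on the real interface,
# otherwise the tunnel can never come up:
nft add rule inet killswitch out ip daddr 203.0.113.10 accept
```

With the default policy set to drop, a dead tunnel means the torrent clients simply lose connectivity instead of leaking over the real interface.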
Hi all home network administrators :) Haven’t posted anything here since June, when I told you about Gatekeeper 1.1.0. Back then it was a pretty bare-bones (and maybe slightly buggy) DNS + DHCP server with a web UI with a list of LAN clients. Back at 1.1.0 Gatekeeper didn’t even configure your LAN interface or set up NAT...
Very interesting project; thanks for sharing and working on this. I am actually one of your target users, as I have enough knowledge to implement my own router, at the moment running on Gentoo.
I would like to use this, but it lacks port forwarding and a firewall, which are a must. I’ll try it out nevertheless. I’m quite impressed by the stylish HTML graphics, and I appreciate your departure from the typical “modern” gray corporate Bootstrap UI design. It’s really, really cool.
One question: how do you envision exposing this service to the internet? I quite despise Rust, but I wonder if the use of a memory-safe language would help with the inevitable bugs, especially if you put even more features into Gatekeeper.
You are literally just posting buzzwords. You can be lean with MySQL, and you can write bloated programs in Rust. I would argue most Rust web services are shittier than Java ones.
I got Jellyfin up and running, it’s 10/10. I love this thing, and it reinvigorated my love for watching movies. So I decided to tackle all the other services I wanted, starting with Paperless-ngx…...
My point: if you’re getting started self-hosting, you have to embrace and accept the self-inflicted punishment. Good luck, everybody; I don’t know if I can keep choosing to get disappointed.
I would say that your self-inflicted punishment is using Windows. Switch to Debian and thank me in six months.
What drew you to the high seas?
Godzilla Minus One becomes the most pirated movie in the world - Dexerto (www.dexerto.com)
How do you handle family requests that you disagree with?
[Request] Any Guides to FFMPEG, Transcoding, Codecs, and Metadata?
How should I do backups?
Emulation on Linux
Updated my Gentoo guide to Sailing the High Seas (www.paneburroezucchero.info)
I just deleted my entire library and redownloaded it.
deleted_by_author
I’ve just released Gatekeeper 1.6.0. It’s a single executable that turns any Linux machine into a home gateway. Now with realtime traffic graphs, LAN autoconfiguration, full cone NAT and better looks. (github.com)
GitHub - jeena/fxsync-docker (github.com)
cross-posted from: jemmy.jeena.net/post/185617...
What are some hidden gems and rare pieces of media? Maybe even deleted or lost on all legal distribution platforms?
What media have you found in best quality or even only with the methods of the high seas?...
[classic] Anon punches a ghost (i.redd.it)
It's Hard to Stay Motivated