> Compression dictionaries are pieces of compressible content known ahead of time, which compression engines use to reduce the size of compressed content.
It's pretty effective. A proposal is being designed to support compression dictionaries over HTTP, which could have a real impact on the Web in terms of network bandwidth and speed.
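The core idea can be demoed with Python's stdlib `zlib`, which supports preset dictionaries via `zdict` — a toy sketch of the technique, not the HTTP proposal itself (the dictionary and payload here are made up):

```python
import zlib

# The "dictionary": content both sides already know (here, a typical response shape).
dictionary = b'{"status": "ok", "user": {"id": 0, "name": ""}, "items": []}'

# The actual payload shares most of its bytes with the dictionary.
payload = b'{"status": "ok", "user": {"id": 42, "name": "alice"}, "items": ["book"]}'

# Without a dictionary.
plain = zlib.compress(payload, 9)

# With a preset dictionary: matches against the dictionary cost almost nothing.
co = zlib.compressobj(level=9, zdict=dictionary)
with_dict = co.compress(payload) + co.flush()

# The decompressor needs the same dictionary to reconstruct the payload.
do = zlib.decompressobj(zdict=dictionary)
assert do.decompress(with_dict) == payload

print(len(plain), len(with_dict))  # the dictionary version is noticeably smaller
```

The HTTP proposal applies the same trick at the transport level: a previously downloaded resource acts as the dictionary for the next one.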
Today I released rav1ator-cli, a little terminal script that lets you easily interface with Av1an, among other things! Read more about it & learn how to install it here: https://wiki.x266.mov/docs/utilities/rav1ator-cli
I don't think I have enough followers yet for this to reach the right people, but just in case: if you wear #compression garments for #POTS & it seems to help, how can you tell it's helping? Changes observed in HR, improved symptoms, or just "feels better"? Some other way?
And THAT's why you use compression. This is the language server log from Neovim: it would take 114 MB if not for compression, which reduced it to a tiny 3.9 MB!! Save space, save your SSD's lifespan.
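Logs compress that well because they are extremely repetitive. A quick sketch with Python's stdlib `gzip` (synthetic log lines, not the actual Neovim LSP log):

```python
import gzip

# Synthetic, highly repetitive "log": LSP logs repeat the same JSON-RPC scaffolding.
line = b'[DEBUG] rpc.send {"jsonrpc":"2.0","method":"textDocument/didChange","params":{}}\n'
log = line * 10_000  # roughly 800 KB uncompressed

packed = gzip.compress(log, compresslevel=9)
print(f"{len(log)} -> {len(packed)} bytes ({len(log) / len(packed):.0f}x)")
```

Real logs vary more line to line, so ratios like 114 MB to 3.9 MB (about 29x) are plausible without being this extreme.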
Perhaps this was obvious to you, but it wasn't to me. So I'm sharing in the hope that you don't spend an evening trying to trick your webserver into doing something stupid. For years, HTTP content has been served with gzip compression (gz). It's basically the same sort of compression algorithm you get in a […]
Are there any #fediverse platforms that don't compress or otherwise ruin image uploads? #mastodon reduces many images to under 2000 pixels in either direction, usually to the 1600-1800 range from what I've seen.
OFC it won't be even remotely efficient when compared to modern compression...
Even #LZ77 will run circles around it, not to mention #bzip2, #lzma or high-efficiency vocoders like #Codec2....
Over the last few days I've been researching HTTP #compression a bit. What do you think about adding pre-compressed versions (#gzip and #brotli) of static assets during the Nextcloud build phase? Is the increase in storage space reasonable? It would reduce CPU usage and transmission size.
Optionally we could use zopfli for gzip to achieve even greater compression (slower).
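A build step along these lines could do the precompression. This is a sketch using only stdlib `gzip` (brotli and zopfli would need external tools or packages, so they're left out); the function name and extension list are my own choices:

```python
import gzip
from pathlib import Path

def precompress(root: str, exts=(".js", ".css", ".svg", ".html")) -> None:
    """Write a .gz next to each static asset so the server can send it as-is."""
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix in exts:
            data = path.read_bytes()
            packed = gzip.compress(data, compresslevel=9)
            # Only keep the .gz if it actually saves space.
            if len(packed) < len(data):
                path.with_name(path.name + ".gz").write_bytes(packed)
```

Web servers can then serve the sibling files directly (nginx has a `gzip_static` module for exactly this), so no CPU is spent compressing on each request.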
The story of how David Huffman came up with his lossless compression scheme, prompted by a challenge from his professor.
"Huffman’s approach has turned out to be so powerful that, today, nearly every lossless compression strategy uses the Huffman insight in whole or in part."
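The insight: give frequent symbols short codes, building the code bottom-up by repeatedly merging the two least frequent subtrees. A compact sketch (code tables rather than an explicit tree, for brevity):

```python
import heapq
from collections import Counter

def huffman_codes(text: str) -> dict[str, str]:
    """Build a prefix-free code: frequent symbols get shorter bit strings."""
    freq = Counter(text)
    # Heap entries: (frequency, unique tiebreaker, {symbol: code-so-far}).
    heap = [(n, i, {sym: ""}) for i, (sym, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        n1, _, c1 = heapq.heappop(heap)  # least frequent subtree
        n2, _, c2 = heapq.heappop(heap)  # second least frequent
        # Merge: prefix one side's codes with 0, the other's with 1.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        count += 1
        heapq.heappush(heap, (n1 + n2, count, merged))
    return heap[0][2]

codes = huffman_codes("mississippi")
# 's' and 'i' (4 occurrences each) get shorter codes than 'm' (1 occurrence)
```

Because no code is a prefix of another, the bit stream decodes unambiguously without separators — that's the insight nearly every lossless compressor still leans on.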
🆕 blog! “Selectively Compressed Images - A Hybrid Format”
I have a screenshot of my phone's screen. It shows an app's user interface and a photo in the middle.
If I set the compression to be lossy - the photo looks good but the UI looks bad.
If I set the compression to be lossless - the UI looks good but the filesize is huge.
Is there a way to selectively compress different parts of an image? I know WebP and AVIF are pretty magical but, as I understand it, the whole image is compressed with the same algorithm and the same settings.
There are two ways to do this. The impossible way and the cheating way.
In theory it should be possible to tell an image format to compress some chunks of an image with a different compression algorithm.
And yet... none of the documentation I've found shows that's possible.
GIMP's native XCF and Photoshop's PSD files can do this; they store separate layers, each of which can use a different format. I understand that TIFF and DjVu also have that capability.
But those sorts of files don't display in web browsers.
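One browser-friendly way to layer the two is an SVG wrapper with both images embedded as data URIs. This is a sketch of that approach in Python — my guess at the mechanism, not the post's exact markup:

```python
import base64

def hybrid_svg(jpeg: bytes, png: bytes, width: int, height: int) -> str:
    """Wrap a lossy JPG with a lossless PNG layered on top, in one SVG file.

    Transparent areas of the PNG let the JPG show through underneath."""
    j = base64.b64encode(jpeg).decode()
    p = base64.b64encode(png).decode()
    return (
        f'<svg xmlns="http://www.w3.org/2000/svg" width="{width}" height="{height}">'
        f'<image width="{width}" height="{height}" href="data:image/jpeg;base64,{j}"/>'
        f'<image width="{width}" height="{height}" href="data:image/png;base64,{p}"/>'
        "</svg>"
    )
```

SVG paints in document order, so the PNG (listed second) sits on top of the JPG — and browsers display .svg files natively.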
That draws the JPG then draws the PNG on top of it. If the PNG has a transparent section, the JPG will show through. The JPG can be set to as low a quality as you like and the PNG remains lossless.
Embedded images are Base64 encoded, which does lose some of the compression advantage. But, overall, it's smaller than a full PNG and better quality than a full JPG.
Look, if it's stupid but it works it's not stupid.
But surely there must be a way of doing this natively?
🎞️ Quickly resize a video with FFmpeg/Vaapi for Mastodon
— @paulox
「 Mastodon’s limitations on media files that can be uploaded have changed in recent months, but still remain stringent for media produced by some devices.
Here are the new limits:
maximum size: 99 MB
maximum resolution: 3840 x 2160px (4K UHD)
maximum frame rate: 120 fps
allowed extensions: .webm .mp4 .m4v .mov 」
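For a software-only variant (no VAAPI), the ffmpeg invocation could be assembled like this — the scale/fps/codec choices below are my assumptions, picked to fit the limits above, and the filenames are placeholders:

```python
def mastodon_ffmpeg_args(src: str, dst: str) -> list[str]:
    """Build an ffmpeg command fitting Mastodon's limits (software encode, no VAAPI)."""
    return [
        "ffmpeg", "-i", src,
        # Cap width at 3840, keep aspect ratio; -2 forces even height for H.264.
        "-vf", "scale='min(3840,iw)':-2",
        "-r", "60",                    # output at 60 fps, within the 120 fps cap
        "-c:v", "libx264", "-crf", "23",
        "-c:a", "aac",
        dst,                           # e.g. out.mp4 — an allowed extension
    ]
```

Run it with `subprocess.run(mastodon_ffmpeg_args("in.mov", "out.mp4"), check=True)`; the `min(3840,iw)` expression leaves videos that are already small enough untouched.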