thingsiplay

@thingsiplay@beehaw.org


thingsiplay, (edited)

Examples of unverified apps:

… these would be hidden by default. Are any of these applications dangerous or a security risk to the system or its user?

Linux Mint:

Unverified Flatpaks represent a huge security risk.

I personally don’t like this. It is not really true, and in the worst case it is misleading and gives a false sense of security. If an app represented a huge security risk, why would it be allowed in the repository in the first place? Unverified does not mean it is a security risk; that is their interpretation of it. Unverified simply means it has not been verified by the original author.

Someone can create a fork of an app, publish the fork on Flathub and verify it against their own website: the system is already broken. Another point is that a lot of unverified apps are just normal apps, because this is how applications are handled in Linux. We have the right to create alternative versions of programs, and the verification badge will show that. There is no point in hiding alternatives; doing so undermines one of the reasons we use the GPL and Open Source. And what about apps whose original author does not care, but which were brought to Flatpak by a community member?

Flathub:

It’s a similar failure to what Flathub does on its own site, just for a different thing.

Potentially unsafe: Full file system read/write access; Can access some specific files

Even though LibreOffice is verified, it is marked as a potentially unsafe application on Flathub.
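
For what it’s worth, you can check locally what that label is based on. Assuming the Flathub app ID org.libreoffice.LibreOffice and that the app is installed, flatpak can print the sandbox permissions it refers to:


# Print the sandbox permissions of an installed app (app ID assumed from Flathub).
flatpak info --show-permissions org.libreoffice.LibreOffice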

thingsiplay, (edited)

Besides that, I don’t like that the installation is a curl piped into Bash, and that the script itself then curls data from the internet (from another repository, github.com/HorlogeSkynet/thunderbird-user.js); that is not good practice. But setting that point aside, since it gets ignored anyway…

A tip: you can use multiple -e expressions in a single sed command, so sed runs only once and the file is loaded only once (or only a few times). Like this: sed -e 's/abc/ABC/g' -e 's/def/DEF/g' file.ext . You can also put each expression on its own line:


sed \
-i \
-e 's/abc/ABC/g' \
-e 's/def/DEF/g' \
user.js

Why do you have so many huge blocks of whitespace, such as 10 empty lines in a row, multiple times throughout the script? And why are there so many unused lines of code that are commented out? I think you should delete them completely so they don't confuse people. If you want to offer additional features people can use optionally, then either create options for them or create another script that is only run if it is installed (check if the file exists and then run it if the user installed it), as sketched below.
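
As a rough sketch of that last idea (the script name and path here are placeholders I made up):


# Run an optional extra script only if the user chose to install it.
extra="$HOME/.local/bin/extra-hardening.sh"
if [ -f "$extra" ]; then
    bash "$extra"
fi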

These are just a few thoughts I had when looking into the code.

thingsiplay,

Curling into a temporary directory and then piping into Bash is effectively the same as the current approach. Why not provide clear installation instructions and maybe even a separate installation script? Why does the setup script download the hardening script from the web if it is included in the repository anyway?

Here is how I imagine the install instructions could look. The git clone command downloads all files from the repo, including the hardening-overwrite script. With bash scriptname the user does not need to use chmod. I would remove the curl from the setup script. Also, there is a dedicated install command in Linux.

Inside setup.sh you could use:


program='thunderbird-hardening-overwrite'
install -v "${program}" ~/.local/bin/"${program}"

And the installation instructions in the Readme could look like this:


git clone https://github.com/boredsquirrel/thunderbird-hardening-automation
cd thunderbird-hardening-automation
bash setup.sh

If people are capable of copying the curl command, then they are capable of copying a few more lines like above.


Ah, I didn’t think about the commenting-out part. That breaks this approach; if commenting out individual expressions is something you want to allow, then this technique wouldn’t work. There is still a way to run sed only once, by building the command as a Bash array. I use this technique in my own scripts nowadays, but it might look strange to people who don’t know it. Commenting out lines is possible with arrays. Not sure if you would want to do that. In case you want to see what it looks like:


# Base command.
sed_cmd=(
    sed
    -i
)

# Arguments that can be added by condition or excluded with commenting out.
sed_cmd+=(-e 's/abc/ABC/g')
sed_cmd+=(-e 's/def/DEF/g')

# Then the last argument that is intended to be added always.
sed_cmd+=(user.js)

# Execute the Bash array as a commandline:
"${sed_cmd[@]}"

This might look intimidating, and I can understand if you pass on it. But I just wanted to bring it to your attention. You might want to experiment before committing to it.

thingsiplay,

git clone is useful if you actually want to keep the source code you originally downloaded. I also assume people who use this command to get a program would remove that directory manually after the job is done (if they don’t want to keep it). And I am always very careful with rm commands, so I do not include them most of the time. It’s not as if people don’t know how to deal with temporary files they download; as an analogy, it’s like downloading an archive, unpacking it and removing the archive file afterwards.

At least this is my way of doing it. I think this transparency is good for the end user, better in my opinion than “hiding” it behind a curl into Bash (opinions vary, as I have noticed in the forums). You could put a cd Downloads right above the git clone command, to remind them it is meant to be temporary, as in the example below. But I guess this does not align with the values and philosophy you follow, because you want it to be as simple and distraction-free as possible for your users; that is why the curl into Bash exists in the first place. It’s a matter of priorities and what you value more.
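
To illustrate, just a sketch of the same commands with that reminder added:


cd ~/Downloads
git clone https://github.com/boredsquirrel/thunderbird-hardening-automation
cd thunderbird-hardening-automation
bash setup.sh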

thingsiplay,

I see. Indeed, if this is the way you want to proceed, having individual commands is more appropriate. But then again, if something fails, isn’t it better to fail the entire script instead of proceeding silently without the failure being noticed? It depends; in some cases that can be the desired behavior.

thingsiplay,

Hey, I’m not trying to convince you, I just wanted to mention something more to think about. Sometimes fail-safety is truly the better way. But is it here too? If the script fails on a single combined sed command, then the file is not modified at all. If you have a bunch of separate sed commands and one or two fail, then maybe 90% of the commands succeed and a few do not, which means the script left the file in a state that was never intended. If it is a single command that fails all at once, at least the file is preserved as it is.
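
To sketch what I mean (assuming the edits are sed expressions in a plain Bash script): with set -e the script aborts at the first failed command, and because all expressions sit in one sed call, a bad expression normally makes sed exit before writing anything, so user.js stays as it was.


# Abort on the first failed command instead of silently continuing (assumed Bash).
set -euo pipefail

# One sed call: either every expression applies, or the file is left untouched.
sed -i \
    -e 's/abc/ABC/g' \
    -e 's/def/DEF/g' \
    user.js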

I don’t know enough about this project to know what is important and appropriate in your case. If it is okay for individual commands to “fail”, then keep it this way.

thingsiplay,

It’s a bit premature. There needs to be an alternative to fax first.

thingsiplay,

RetroArch has a dedicated directory for BIOS and system files. Usually the PlayStation BIOS goes into the “system” folder in RetroArch. It’s described here; you need the correct BIOS files: docs.libretro.com/library/beetle_psx_hw/

thingsiplay, (edited)

You replied to yourself, so I did not get a notification.

Just put the BIOS files in the system directory. They don’t need to be scanned. If they are the correct files, RetroArch will pick them up automatically from that location. All you need to do is put those 3 BIOS files with the .bin extension into the system folder, for example like this:
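
On a typical Linux install that means something like the following (the base path is an assumption; check RetroArch’s directory settings if yours differs):


# Copy the PS1 BIOS files into RetroArch's system directory (default Linux path assumed).
cp scph5500.bin scph5501.bin scph5502.bin ~/.config/retroarch/system/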

Then you can check whether the BIOS files are missing or present at Settings > Core > Manage Cores > “Sony - PlayStation (Beetle PSX HW)”, or at whichever core you actually use. Scroll down to the list with the exclamation marks (!). It tells you which of them are “Present” and which are “Missing”.

thingsiplay,

No. All you need to do is put them in the right place. If they are marked as missing, then they are either not in the correct folder, not named correctly, or the wrong versions.

Note: you don’t need every file marked as missing in this list. There are BIOS files for the PSP and PS3, but you don’t need them. You only need the 3 original PS BIOS files, one per region: scph5500.bin, scph5501.bin, scph5502.bin . Make sure they are the correct versions. The easiest way to check that is with md5sum , then compare the output with the MD5 listed at the documentation link I gave you earlier, which looks like this: 490f666e1afb15b7362b406ed1cea246 . It’s enough to compare just the first few and last few characters to make sure it’s correct.
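
For example, on Linux you can run this from inside the system folder and compare the printed hashes with the ones in the linked documentation:


# Print the MD5 checksum of each BIOS file for comparison against the libretro docs.
md5sum scph5500.bin scph5501.bin scph5502.bin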

Without wanting to be creepy (I have only good intentions), I looked up your post history to see if you were a Linux user. I can see you were setting up Batocera at some point. It is entirely possible that the system you are using set up a different directory for the RetroArch BIOS “system” folder. Open RetroArch and look up Settings > Directory > “System/BIOS” . The very first entry in this list shows where the RetroArch system directory is set up for you. The PlayStation BIOS files go in that folder.

Can you play any PlayStation games at all? If not, it’s best to create a log file. I don’t know what platform you are using; here is an overview of how to do that on every platform you have RetroArch installed on: docs.libretro.com/…/generating-retroarch-logs/
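
On Linux, for example, one way is to start RetroArch from a terminal like this (assuming the retroarch binary is on your PATH; the linked page covers the other platforms):


# Start RetroArch with verbose output written to a log file.
retroarch --verbose --log-file ~/retroarch.log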

yt-dlp-lemon: Simple wrapper to yt-dlp with only a subset of options. (github.com)

The project name has changed from “ytdl” to “yt-dlp-lemon”, after user “lol” in the comments convinced me. Thank you for the suggestion! Remember to change the directory name from ~/.local/share/ytdl to ~/.local/share/yt-dlp-lemon ....
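
For anyone migrating, that rename is a single command, assuming the old directory exists:


# Move the old data directory to the new project name.
mv ~/.local/share/ytdl ~/.local/share/yt-dlp-lemon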

thingsiplay, (edited)

That’s actually a good point, one that I have made myself about other projects in the past. I guess I did not think this whole thing through. The script itself is kind of like what an alias is meant to be: a shorter, simpler alias for a more complex command that already exists. In fact, it started out and evolved from that. The script is not a project in its own right, the way yt-dlp is independent from youtube-dl. But I admit this is a bad excuse.

I would like to keep the simple and easy-to-remember ytdl executable name. Anyone who wants to can still rename it to something else when installing. On the other hand, the project title and how the project is referred to are ambiguous. Changing the readme and some descriptions and titles is not a big deal (at this early stage), but changing the path to the project comes at a huge cost, meaning the GitHub link.

thingsiplay, (edited)

You have completely, 100% convinced me. Right now I’m in the process of renaming the project, and I decided to rename the executable too. All your points are on point and are things I would probably say about other projects as well.

The reason I was reluctant to change the project name was not only the GitHub links, for example in the executable and documentation of older versions people downloaded, but also the default ignore file the script uses, which is created and read/written at “~/.local/share/ytdl” . It will probably annoy people who have already downloaded it and are using it, and I hate doing that to them. But better I do it now than later.

So thank you for this suggestion and explanation.

Edit: I’m going with yt-dlp-lemon , as in easy peasy, lemon squeezy. Probably not as descriptive as you hoped for, but I don’t think it’s bad.

thingsiplay,

Okay?

thingsiplay,

It was not just 6 months later. The original GTA V was released on the old-gen consoles, then one entire year later on the new-gen consoles, and then another 1.5 years later (meaning 2.5 years after the original release) it came out for PC. But I expect it will come to PC much faster this time, as every day it isn’t released is money not earned. They won’t wait for the next generation. GTA V was released quite close to the end of the last generation back then; this time VI will come out in the middle of the generation. Also, the current consoles are closer to PCs in hardware and software than before, so a port is easier than it was with V.

I hope they take their time to upgrade and optimize the PC version. This is a massive game that needs its time, especially because PC players have higher standards than console players and a greater variety of hardware that needs to be tested and ironed out as much as possible. Obviously I’m a user like you and don’t know how much truth there is to all of this, but it sounds plausible to me.

thingsiplay,

And I am saying that if they want to do it, then it should be an add-on, not something installed by default that cannot be removed.

thingsiplay,

Thank you for the understanding and for not getting stuck on my choice of words. I have to say that local AI is much more acceptable to me and remedies some of the key points I dislike about typical AI usage. But I still don’t like the idea of an AI, as it is a black box that cannot be verified. With source code we can look at it, test it, modify it, build it. With an AI like this, we cannot. There is a lot I don’t like about AI.

As for the autocomplete question: well, yes, of course I use autocomplete for my programming, just not with AI. Only simple autocomplete. And I like it that way.

thingsiplay,

Why not? I think you are being ridiculous. Merging two identical rows is not a good reason to introduce AI into the system. This could have been done without AI, but if they really want an AI, then it has to be an add-on. If it’s really that simple, then there is no need for an AI.

thingsiplay,

Maybe you should stop insulting people like an idiot.

You asked me why this AI for a simple two-line merge should be an add-on, and I explained that it should be because it is an AI. If you have no good argument, then insulting people won’t make you right. You make a dumbass example, I give you an answer, and you give me this stupid reply. Idiots get blocked.

thingsiplay,

Remember the world is not about only you but also people having disabilities.

Remember the world is not only about people with disabilities, either. Secondly, this is a nonsense argument, because this does not “require” AI, especially not for every user. If it’s integrated into Firefox and I cannot remove it, then it’s very much forced. Why not make an extension for the people who need or want it? (Nobody needs this.)

thingsiplay,

Generative AI will be forced on you, regardless of whether you want it in your life. The entire world is moving in that direction very rapidly.

No. It’s sad that you don’t know what freedom means. AI is marketing bullshit and a way to control you. Nobody will force me to use it, and you know it. Just because you tell me I have no choice does not take away my choice. You are a Linux user, goddamn it, you should know better.

Don’t let corporations control you with AI. Don’t be a lemming.

thingsiplay,

Which does not mean that I HAVE TO USE IT ON MY MACHINE! Shift planning and insurance are not software I use, and they are not MY RESPONSIBILITY. I don’t use Spotify, and it is not as important a program to me as my Firefox. You guys search for reasons to accept AI and let every company control you for no reason, all in the name of “accessibility”, when in reality it is not needed.

thingsiplay,

You missed my point entirely. If it is an extension that is installed by default because a minority needs it, then at least the majority who don’t want it can remove the extension. This is all the more important because it is AI and not a regular program. AI is always a black box that cannot be verified.

And if it’s too hard for a disabled person to install an extension, then it’s probably too hard for them to use Firefox in the first place. That’s nonsense argumentation. But that’s not even my actual argument, and I think you guys are trying to misunderstand me just because I don’t like what you like. AI in the browser is a bullshit idea, no matter whether the person is disabled or not, and not something “required” as the base minimum that cannot be removed.

thingsiplay,

You have no idea why I dislike it, and you make nonsense judgements about my dislike. I think this is a normal defensive human reaction on your part, because I dislike something (for good reasons) that you like.
