It’s actually surprising how many people in tech aren’t really aware of the security and privacy side of things, especially in their personal lives. They may be more secure at work, where the infosec team has oversight, but I know a lot of very technical people who don’t apply basic security/privacy measures outside of it.
Yeah. I wanted to like submodules, but submodules, to me, ended up feeling like one feature too many in git.
I mainly run into submodules that have been set up accidentally by cloning inside an existing clone. That situation is, of course, not great.
Even for the many reasonable use cases for submodules, I usually end up letting my actual package manager do the work instead. I’m generally happier for it, since life tends to be simpler when my package manager of choice knows about any required libraries.
My CDN bill recently went from about $5 a month to over $200. It turned out to be TikTok’s spider relentlessly scraping the same content over and over again.
It was ignoring robots.txt. In the end I just had to ban their user agent in the CDN config.
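For anyone wanting to do the same at the edge: ByteDance’s crawler commonly identifies itself as “Bytespider”, though the UA string is worth verifying against your own logs. A minimal sketch in nginx syntax (the actual CDN’s rule language will differ):

```nginx
# Return 403 to ByteDance's "Bytespider" crawler, matched by user agent.
map $http_user_agent $blocked_ua {
    default      0;
    ~*bytespider 1;
}

server {
    listen      80;
    server_name example.com;

    if ($blocked_ua) {
        return 403;
    }
}
```

Since the spider ignores robots.txt, a hard 403 (or an equivalent rule in the CDN’s WAF layer) is about the only reliable option.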
Yea. Along with web rings, human-focused search and just harbouring communities better … we gotta start building people-focused online gardens and ditch this capitalistic hustle shit.
Yeah, I’m wondering about how they characterize “bot activity.” It seems like “any traffic not proximally related to a user’s synchronous activity” is a little too broad.
I’m not sure if fediverse syncing is bot activity. Or my laptop checking for software updates while I’m sleeping. Or my autopay transactions for utility bills.
From the org’s definition of bots, I’d say it’s implicit that bot activity excludes expected communication in an infrastructure, client-server or otherwise. A bot is historically understood as an unexpected, nosy guest poking around a system. A good one might be indexing a website for a search engine. A bad one might be scraping email addresses for spammers.
In any case, none of the examples you give can be reasonably categorized as bots and the full report gives no indication of doing so.
I’d argue that, given their definition of a bot as “a software application that runs automated tasks over the internet” and their later definition of download bots as “automated programs that can be used to automatically download software or mobile apps”, automated software updates could absolutely be counted as bot activity by them.
Of course, if they count it as such, the traffic generated that way would fall into the 17.3% “good bot” traffic and not in the 30.2% “bad bot” traffic.
Looking at their report, without digging too deep into it, I also find it concerning that they seem to use “internet traffic” and “website traffic” interchangeably.
Firstly, we need to get rid of the cameras every 10 m. That is ridiculous! I have seen cameras at a street vendor’s stall, supposedly kept for “protection”. Against what, the right to beg?
Each camera is an offense against the customer: you’re not trusting anyone at all!
We have to say the golden words: you poor bastard, do you need Z+ security?
I mean, ssh is built into Windows 11 and has been a part of the Mac since like the OG release: I can’t remember downloading a Mac ssh client since OS 9. And Windows has had WSL, so choose your flavor of Linux to run and ssh into your jumpbox.
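To round that out, the bundled OpenSSH client on all of those platforms reads the same `~/.ssh/config`, so the jumpbox hop can be a one-line `ProxyJump`. A sketch with placeholder hostnames:

```
# ~/.ssh/config — hop through a jumpbox using OpenSSH's ProxyJump.
# Hostnames and user are placeholders.
Host jump
    HostName jump.example.com
    User me

Host internal
    HostName 10.0.0.5
    User me
    ProxyJump jump
```

`ssh internal` then tunnels through the jumpbox transparently, and the same file works in Windows 11’s built-in OpenSSH, macOS, and any WSL distro.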
They do, however, see sponsored search results on google.com, and that’s how this attack chain starts. You search for one of those tools, get a sponsored result and click it. You’re then whisked away to a spoofed site.
helpnetsecurity.com