Many people have been talking about deleting their #Reddit account and posts because they hate what the Reddit CEO does.
While I welcome a major #RedditExodus (I never posted on Reddit because it's #proprietary), I urge people to #backup their high-quality posts (those that have helped a large number of people) before they delete their account for good.
And then repost your high-quality stuff somewhere else for archival purposes.
"City of Augusta, GA: this is perhaps one of the largest government data thefts in recent years in U.S."
@amvinfe aka #SuspectFile dives into BlackByte's leak of the Augusta, GA data after the attackers encrypted the city's files and backups and then leaked 83 GB of data.
Despite measures against cyberattacks and ransomware, many attacks still succeed and the data ends up encrypted. A few key points help ensure you end up with usable backups.
This isn't news: encrypt all media, every single one without exception, scatter copies, and secure them in the cloud! Nobody wants to be left without their digital documents after repressive measures like these. #verschlusselung #backup
@AAKL Of course, something allowed the #ransomware into your system to begin with, and you also need to plug that hole or you're likely to just end up having to go through the ordeal again after restoring that #backup. To say nothing of the risk of having your data exposed.
Sometimes the simplest things, like promptly installing updates as they become available, can be all that's needed. In other words, basic #security hygiene.
According to reporting by The Register, Richard Addiscott, a senior director analyst at Gartner, cited these stats in a conference talk this past week:
-- Just 4% of ransomware victims recover all their data.
-- Only 61% recover any data at all.
-- Victims typically experience 25 days of disruption to their businesses.
It's not clear to me whether that 61% refers to victims who pay or to all ransomware victims, but reading the stats in the context of the article, I think it means those who pay. See what you think.
After any significant change to your #firewall setup, create a #backup of the config. For example, with #OPNsense, go to System > Configuration > Backups > Download configuration. Get into the habit of doing this. Forgetting to do so after happily tinkering away could put a serious dampener on your weekend! 👍 🔥 🛡️ #networking #cybersecurity #admin
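To make the habit stick, the filing-away step can be scripted. A minimal Python sketch, assuming you've already downloaded the exported config.xml (or fetched it from the firewall) to a local path; the function name and directory layout are illustrative, not part of OPNsense:

```python
import datetime
import pathlib
import shutil

def backup_config(src, dest_dir):
    """Copy a saved firewall config into a timestamped backup file.

    Generic sketch: with OPNsense you'd first download config.xml via
    System > Configuration > Backups, then point `src` at that file.
    """
    dest = pathlib.Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    target = dest / f"config-{stamp}.xml"
    shutil.copy2(src, target)  # copy2 preserves the file's mtime
    return target
```

Run it right after each download and you accumulate a dated history of configs you can diff or roll back to.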
@KoPPeR is switching instances and has an export of his data, which comes in JSON format. Does anyone have an idea how to display it all in a readable way?
Local would be important; an online reader isn't really trustworthy for this.
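For a local, readable view, a short Python script is enough. A sketch assuming the usual Mastodon/ActivityPub export layout (an outbox.json with an "orderedItems" array of Create activities); adjust the keys if the export differs:

```python
import html
import json
import re

def show_posts(path):
    """Print the posts from a Mastodon-style outbox.json in plain text.

    Assumes each entry's "object" dict carries "published" and HTML
    "content"; boosts (whose object is just a URL string) are skipped.
    """
    with open(path, encoding="utf-8") as f:
        data = json.load(f)
    for item in data.get("orderedItems", []):
        obj = item.get("object")
        if not isinstance(obj, dict):
            continue  # boost/Announce entries reference a URL, not a post
        text = re.sub(r"<[^>]+>", "", obj.get("content", ""))  # strip HTML tags
        print(obj.get("published", "?"), "-", html.unescape(text))
```

Everything stays on your own machine, which addresses the trust concern with online readers.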
I like to keep only my most current emails in my actual email account, everything else goes into the archives. It presents a neater division of materials for my messy mind.
I export all the emails as .eml files occasionally - in case something goes bad with MailStore's db (it's Firebird and maybe I just don't know enough about it, but I don't quite trust it).
Anyone know of a good cloud #backup service? As in, backups for user data on VPSes. I’m rolling my own at the moment, and it’s getting kind of pricey as the backup volumes grow over time.
Anecdotally, I’ve had a ton of WD and Seagate drives, even the old IBM « DeathStar » drives, over the years, and more than my share of failures and data loss. They all go eventually. I’ve had bad luck lately with Seagate USB drives, but the issue seems to be the enclosures causing corruption rather than the drives themselves. Be vigilant and #backup your data
📨 Latest issue of my curated #cybersecurity and #infosec list of resources for week #17/2023 is out! It includes, among others:
‣ Hackers target vulnerable #Veeam #backup servers exposed online
‣ #FBI queries for Americans’ digital data drop, yet advocates for surveillance reform remain undeterred
‣ #OpenAI: #ChatGPT Back in #Italy After Meeting Watchdog Demands
‣ Many Public #Salesforce Sites are Leaking Private Data
‣ #NIST CSF 2.0 Core discussion draft released, stakeholder feedback invited
‣ #Paperbug Attack: New Politically-Motivated Surveillance Campaign in #Tajikistan
‣ #Linux version of RTM Locker #ransomware targets #VMware ESXi servers
‣ New Atomic #macOS info-stealing #malware targets 50 crypto wallets
‣ #Google Gets Court Order to Take Down #CryptBot That Infected Over 670,000 Computers
‣ #Telegram restricted in #Brazil after refusal to supply user data to authorities
‣ #Cisco discloses XSS zero-day flaw in server management tool
‣ Ukrainian arrested for selling data of 300M people to Russians
‣ Hackers are breaking into AT&T email accounts to steal #cryptocurrency
‣ #Accenture, #IBM, #Mandiant join Elite Cyber Defenders Program to secure critical infrastructure
‣ ATT&CK v13 April Updates
‣ New Data Sharing Platform Serves as Early Warning System for #OTSecurity Threats
‣ North Korean Hackers Target Mac Users With New ‘#RustBucket’ Malware
‣ New All-in-One "#EvilExtractor" Stealer for #Windows Systems Surfaces on the Dark Web
📚 This week's recommended book is: "This Is How They Tell Me the World Ends: The Cyberweapons Arms Race" by Nicole Perlroth
Subscribe to the #newsletter to have it piping hot in your inbox every Sunday ⬇️
#FerretDB 1.0 has been announced
FerretDB is a proxy that sits between your #mongodb drivers and a #postgres database, converting Mongo queries into Postgres SQL and using PG for the persistence.
I am finding myself at a complete loss as to any practical reason why someone would want to use this. The ONLY case I can make for it is that it satisfies people who are zealous about open-source licenses and aren't happy with Mongo's.
Mongo users could port data into PG using existing #backup and #recovery infrastructure; then they'd have both SQL and Mongo query/dump/load options for future #migrations.
The PG-backed API will lag behind Mongo's and may sunset existing features later, leaving the #SQL API as a fallback.
A Mongo app could transition to PG incrementally or partially, improving maintainability by using Mongo queries and SQL queries where each makes sense.
TIL that if you convert from #ext4 to #btrfs, #restic might throw "path not found in index" errors even when check passes. The solution was to run restic snapshots, then pass --parent with the ID of the most recent snapshot for that path when calling restic backup.
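The fix above is easy to script. A hedged Python sketch that just assembles the restic command line using the standard flags (-r for the repo, backup, --parent); the repo path, backup path, and snapshot ID in the example are placeholders:

```python
def restic_backup_cmd(repo, path, parent_id):
    """Build a `restic backup` invocation pinned to a known parent snapshot.

    parent_id is the short ID shown by `restic snapshots` for the most
    recent snapshot of `path`; pinning it avoids the post-migration
    "path not found in index" errors.
    """
    return ["restic", "-r", repo, "backup", path, "--parent", parent_id]

# Example (not executed here; values are placeholders):
# import subprocess
# subprocess.run(restic_backup_cmd("/mnt/repo", "/home", "4f1a2b3c"), check=True)
```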
Seriously, though, if you remote #backup your stuff or otherwise use #S3, #restic is definitely a good tool to know about. There's a little learning curve, but it's worth it, IMO.
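For the remote/S3 case, restic reads its credentials from the environment. A minimal sketch; the bucket path in the comment is a hypothetical example, while AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and RESTIC_PASSWORD are the variable names restic documents:

```python
import os

def restic_s3_env(key_id, secret, repo_password):
    """Return an environment dict for running restic against an S3 repo.

    Copies the current environment and adds the credentials restic
    expects; values passed in here are placeholders, never hard-code
    real secrets in scripts.
    """
    env = dict(os.environ)
    env["AWS_ACCESS_KEY_ID"] = key_id
    env["AWS_SECRET_ACCESS_KEY"] = secret
    env["RESTIC_PASSWORD"] = repo_password
    return env

# Example (not executed here; bucket name is hypothetical):
# import subprocess
# repo = "s3:s3.amazonaws.com/my-bucket/restic"
# subprocess.run(["restic", "-r", repo, "init"],
#                env=restic_s3_env("KEY", "SECRET", "PASSWORD"), check=True)
```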
YSK The Backup 3-2-1 Rule
Why YSK: If you have digital data that is important (family photos, crypto keys/wallets...), back it up and prevent permanently losing it. The rule: keep 3 copies of your data, on 2 different types of media, with 1 copy offsite.
3-2-1 Backup Rule (www.starwindsoftware.com)
Something I haven't seen posted here yet, but worth saying over and over again....