For larger files, the malware generates four of these codes. But due to a programming error, it keeps overwriting each new code with the previous one in the same slot, like writing four different combinations on a single sticky note and keeping only the last one. By the time it’s done, three of the four codes are gone forever. The scrambled data they correspond to is permanently unreadable for the victim, security researchers, and the attackers themselves.
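The slot-overwrite bug described above can be sketched roughly like this (a minimal Python illustration with hypothetical names, not the actual malware code): each freshly generated key is written into the same array slot, so only the last one survives.

```python
import secrets

def generate_keys_buggy(num_chunks):
    """Intended: one random key per data chunk.
    Bug: every key is written into slot 0, so each new key
    overwrites the previous one and the rest are lost forever."""
    keys = [None] * num_chunks
    for i in range(num_chunks):
        key = secrets.token_bytes(32)
        keys[0] = key  # BUG: should be keys[i] = key
    return keys

keys = generate_keys_buggy(4)
# Three of the four slots were never written; the chunks encrypted
# with those keys can never be decrypted, by anyone.
lost = sum(1 for k in keys if k is None)
```

With four chunks, three keys are silently discarded, matching the "one sticky note, four combinations" failure mode.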
They called me crazy for insisting that all of our company's files compress to 10,000 separate 64 KiB xz-compressed files. Well, who's laughing now?!
Best part about doing this is you can get 4 files to a DD 5.25". That alone quadruples the speed of performing off-site backups.
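For what it's worth, the scheme above can be sketched in a couple of commands (hypothetical path; split first, then compress each chunk, so this is one plausible reading of "64 KiB xz files" rather than the poster's exact pipeline):

```shell
# Pack everything into one stream, cut it into 64 KiB chunks,
# then xz-compress each chunk individually.
tar -cf - /srv/company-files | split -b 65536 - chunk.
for f in chunk.*; do xz "$f"; done
```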
Never underestimate the bandwidth of a truck full of disks doing 75 mph on the highway.
That’s a wild bus factor though
You mean if the bus gets hit by a bus?
This is how you drove down probability.
The risks of Steve getting fatally hit by a bus are, statistically, very low.
However, there are even fewer busses getting fatally hit by other busses.
Ipso facto, lower risk, extra 9s.
That’s risk tolerance 101.
It's not really ransomware at that point, is it? More like a data terrorist.
Not even malware is safe from vibe coders
When their alpha testers are already ransomware victims, I suppose they figure what’s the harm?
Once again, tested backups are the answer. I left the backup software industry many years ago, and I don't miss at all hearing customers crying because the new software they just bought can't restore the losses from before they had it.
“We don’t need backups because we’re moving to the cloud” - IT manager whose only technical experience is working on phone switches in the 80s.
God, I feel this one in my soul. I'm forever pointing out that pretty much every cloud storage provider explicitly says they don't back up your data and tells you to have some sort of backup solution in place, and I'm always met with blank stares.
Omg, that's my boss. "Everything is in sync, why do we need physical backups?"
The “tested” part is really key. Until you have successfully restored from a backup, it is basically Schrödinger’s Data. Just an amorphous blob of data that may or may not be a good backup. So many companies set up backups to check an item off a list, and then never actually revisit it to confirm those backups are actually working.
It isn't that key - most backups do work. Backup program creators test that everything works, and there are consultants who can help restore - for a price - in an emergency.
However if you want to restore fast you better have tested the process recently - all the staff needs to have experience in what to do.
If you want to be 100% sure you got everything backed up, you need to do a real test as well. That means you regularly tell everyone there's no working this weekend, leave your computer behind - when you return it will be wiped to factory settings and restored from backup. I don't think anyone does this.
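Short of the full wipe-and-restore drill, part of a restore test can be automated: restore into a scratch directory and compare checksums against the live data. A minimal sketch (hypothetical helper names; this verifies file contents only, not permissions, ownership, or application state):

```python
import hashlib
from pathlib import Path

def tree_digests(root):
    """Map each file's relative path under root to its SHA-256 digest."""
    root = Path(root)
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }

def verify_restore(live_dir, restored_dir):
    """Return the set of live paths that are missing or differ in the restore."""
    live = tree_digests(live_dir)
    restored = tree_digests(restored_dir)
    return {path for path in live if restored.get(path) != live[path]}
```

Run it after every scheduled restore drill; an empty result set means the restored tree matched the live one byte for byte.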
Immutable ones, right. Otherwise they can be overwritten, too.
Overwritten is fine when that is intentional. But the best backups do include media that is completely offline and so if there is an issue you can restore to fresh/new uncompromised systems.
ZFS snapshots are great for this - so far they have not been attacked and when they work they give you what the file was before. (you still should have an offline copy of everything stored in a different campus)