  • This is one of the difficulties with these releases - assessing the credibility of the information in each document is difficult. In this case, what you’ve just revealed might be obvious to anyone reviewing the actual document, but clearly no one can be bothered doing that - everyone just assumes someone else will.

    As the accusations pile up and no one does anything about any of it, the prevailing assumption is that none of it is reliable - of course the reality is that there must be dozens of identifiable witnesses who have or will provide testimony.

    If I were the Trump Admin’s PR boss, I’d probably be making posts containing fake Epstein documents, just to undermine the credibility of anything claiming to be an Epstein release.


  • Deduplication based on content-defined chunking is used to reduce the number of bytes stored: each file is split into a number of variable length chunks and only chunks that have never been seen before are added to the repository.

    A chunk is considered duplicate if its id_hash value is identical. A cryptographically strong hash or MAC function is used as id_hash, e.g. (hmac-)sha256.

    To deduplicate, all the chunks in the same repository are considered, no matter whether they come from different machines, from previous backups, from the same backup or even from the same single file.

    Compared to other deduplication approaches, this method does NOT depend on:

    • file/directory names staying the same: So you can move your stuff around without killing the deduplication, even between machines sharing a repo.

    • complete files or time stamps staying the same: If a big file changes a little, only a few new chunks need to be stored - this is great for VMs or raw disks.

    • The absolute position of a data chunk inside a file: Stuff may get shifted and will still be found by the deduplication algorithm.

    This is what their docs say. Not sure what you mean about different file types, but this seems fairly agnostic?

    I actually hadn’t realised that first point: you can move folders around and the chunks will still be deduplicated.
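    The quoted docs can be sketched in a few lines of Python. This is a toy, not Borg's actual chunker (Borg uses a buzhash rolling hash and HMAC-SHA256 keys); it just shows why moving or shifting data doesn't break dedup: chunk boundaries come from the content itself, and each unique chunk is stored once under its hash.

```python
import hashlib

def cdc_chunks(data: bytes, mask=0x1F, min_size=16, max_size=256):
    """Content-defined chunking with a toy rolling hash.

    A boundary is declared when the low bits of the rolling hash are
    all zero, so boundaries depend on content, not absolute position.
    """
    chunks, start, h = [], 0, 0
    for i, b in enumerate(data):
        h = ((h << 1) ^ b) & 0xFFFFFFFF  # toy 32-bit rolling-ish hash
        size = i - start + 1
        if size >= max_size or (size >= min_size and (h & mask) == 0):
            chunks.append(data[start:i + 1])
            start, h = i + 1, 0
    if start < len(data):
        chunks.append(data[start:])
    return chunks

class Repo:
    """Store each unique chunk once, keyed by its sha256 ('id_hash')."""

    def __init__(self):
        self.store = {}  # id_hash -> chunk bytes

    def add_file(self, data: bytes):
        ids = []
        for chunk in cdc_chunks(data):
            cid = hashlib.sha256(chunk).hexdigest()
            self.store.setdefault(cid, chunk)  # duplicate chunks cost nothing
            ids.append(cid)
        return ids  # manifest: the file is this ordered list of chunk ids

    def restore(self, ids):
        return b"".join(self.store[i] for i in ids)
```

    Because the manifest, not the path, identifies the file's chunks, backing up the same content from a renamed folder (or a second machine) adds zero new chunks to the store.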


  • If he were going to physically harm you or your family in any way, he wouldn’t be harassing you on the phone. He’s legit just trying to scare you.

    The best approach here is to “grey rock” him, which in this case just means not answering. You said you don’t use the home phone so just disconnect whatever device. You said your wife has blocked him so that’s sorted. Close your fb account or don’t log in or whatever.

    Call the police and make an actual report to get an actual report number. Here any kind of threat made over a “wire service” is a crime. They may not investigate, but that’s not the point: in the infinitesimal chance the guy escalates, you want some history.

    Refresh home security without going nuts. Maybe a ring camera, make sure window locks are installed.

    Then just go about your life, but maybe avoid throw down arguments on fb.


  • My docker files, configs, and volumes are all kept in a structure like:

    /srv
    └── docker
        ├── syncthing
        │   ├── compose.yml
        │   └── sync-volume
        ├── traefik
        │   └── compose.yml
        └── [...]


    I just back up /srv/docker, but I blacklist some subfolders, e.g. database data directories for which regular dumps are created instead. Currently the compressed / deduplicated repos consume ~350GB.

    I use borgmatic because you do one full backup and thereafter everything is incremental, so minimal bandwidth.

    I keep one backup repo on the server itself in /srv/backup - yes this will be prone to failure of that server but it’s super handy to be able to restore from a local repo if you just mess up a configuration or version upgrade or something.

    I keep two other backup repos in two other physical locations, and one repo air gapped.

    For example I rent a server from OVH in a Sydney data centre, there’s one repo in /srv/backup on that server, one on OVH’s storage service, one kept on my home server, and one on a removable drive I update periodically.
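    Not my actual config, but a minimal borgmatic sketch of a setup like the above. Paths and the remote hostname are made up; key names follow borgmatic’s flat schema (1.8+), while older versions nest them under location:/storage:/retention: sections, so check your version’s reference.

```yaml
# Minimal borgmatic sketch (hypothetical paths/hosts).
source_directories:
    - /srv/docker

exclude_patterns:
    - /srv/docker/*/db-data   # raw DB files: covered by regular dumps instead

repositories:
    - path: /srv/backup/borg                 # local repo for quick restores
    - path: ssh://borg@backup.example/./borg # off-site repo (hypothetical)

encryption_passphrase: "change-me"   # repos are encrypted

keep_daily: 7
keep_weekly: 4
keep_monthly: 6
```

    With multiple entries under repositories, borgmatic writes each backup to every repo, which is how you end up with the same archives in several physical locations.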

    All repos are encrypted except for the air-gapped one. That one has instructions intended for someone to use if I die or am incapacitated, so it has my master password for my password database, SSH keys, everything. We have a physical safe at home, so that’s where it lives.


  • I’m not really confident in this answer, but: “not that I’m aware of”.

    I use mxroute as a paid / hosted IMAP & SMTP service. They run SpamAssassin, but it’s obviously not trained on my own reports.

    I’ve grown fond of Thunderbird as an email client. Its spam management is clunky, but if you spend 15 minutes or so learning how it works, and then train it with both junk and not-junk, it works reasonably well.

    Sadly, it does occasionally throw a false positive, like maybe twice in the last year it identified a legit email as spam.

    So, while I’m running a SpamAssassin and Thunderbird combo, it’s really TB that’s doing the work, because SA is really just filtering the super low-hanging fruit.

    TB is doing a very respectable job, but needs to be trained.