• رضا@lemmy.world · 3 months ago

    OP is not updating his Arch system regularly.

    His previous update must have been at least two hours old.

  • Buffalox@lemmy.world · 3 months ago

    22.8 GiB install size !?
    WTF?

    I must admit I don’t recall the size of my own installation, but that seems HUGE!
    Anyways congratulations on getting it trimmed. 😋

    • silenium_dev@feddit.org · 3 months ago

      If you’re doing anything with GPU compute (Blender, AI, simulations, etc.), ROCm, CUDA, or oneAPI alone will take up half of that.

    • rustydrd@sh.itjust.works · 3 months ago

      Larger than my entire root partition (currently at 21 GB), but that’s because I made the fatal mistake of limiting the partition to 25 GB when I set it up. So I have to keep it trim, and I envy OP deep down.

      • Buffalox@lemmy.world · 3 months ago

        Haha, I did that once too, because I wanted a separate home partition so that when upgrading I could just reassign it to my new install.

      • Buffalox@lemmy.world · 3 months ago

        Which distro has Factorio as part of the standard package system?
        Seems like a nice way to save €32,-.

        • Ŝan • 𐑖ƨɤ@piefed.zip · 3 months ago

          Oh, I completely skipped the cost part. You still have to pay for the game – it just gives you a way to maintain the installation through the AUR.

    • u/lukmly013 💾 (lemmy.sdf.org)@lemmy.sdf.org · 3 months ago

      I probably got something like that. I’m not really into minimal installs; kde-applications-meta and plasma-meta are what I go with. Absolutely everything.

      I just wish I could safely use KDE Discover for updates. That would probably work with “apply updates on reboot”, which sounds like the safest option. But for some reason packagekit-qt6, which would (probably) make this possible, is not recommended.

      Preferably I’d go with something like KDE Neon or Kubuntu; I just really like KDE. But there’s just no sweet spot for me. Arch gives me new packages with all the bugs, and each update feels scary: what will I discover? Based on my Timeshift notes, the last point without major bugs was October 31st. Something like Linux Mint was stable, but I was missing some newer packages, and even drivers when my laptop was new. And major version upgrades also feel scary, although I don’t even know how they work.

      This is where Arch makes more sense to me: Linux as a desktop OS is really just a huge bunch of packages working together, and they slowly get updated. When packaged into an entire OS, how do you even define a version?

      • Buffalox@lemmy.world · 3 months ago

        I also use KDE, and it is far from minimal, but as I recall my system is only about half that, even with a full system upgrade!
        Some say creativity stuff takes a lot of room, but Blender, for instance, is only half a gig.

        But maybe my system is bigger than I remember, because even at 40 gigs it’s nearly irrelevant compared to the size of an SSD today, and with 1-gigabit internet the upgrades are fast anyway.

        IDK if there’s a way to see the size of my actual Linux install, not counting third-party media or games?
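        One way to approximate that on Arch is to sum pacman’s own “Installed Size” fields. A sketch: this counts only software installed through the package manager, so media, games, and anything in your home directory or Flatpak are excluded.

        ```shell
        # Sum the "Installed Size" of every installed package, as reported
        # by pacman itself (package-managed files only).
        pacman -Qi 2>/dev/null | awk -F': *' '
          /^Installed Size/ {
            split($2, a, " ")
            s = a[1]; u = a[2]
            if (u == "KiB") s /= 1024   # normalize everything to MiB
            if (u == "GiB") s *= 1024
            total += s
          }
          END { printf "Total installed size: %.1f GiB\n", total / 1024 }'
        ```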

  • MangoPenguin@lemmy.blahaj.zone · 3 months ago

    A 6 GB download is wild. Is that re-downloading the entire package for each one that needs an update? Shouldn’t it be more efficient to download only the changes and patch the existing files?

    At this point it seems like my desktop Linux install needs as much space and bandwidth as Windows does.

    • Ephera@lemmy.ml · 3 months ago

      This doesn’t work too well for rolling releases, because users will quickly get several version jumps behind.

      For example, let’s say libbanana is currently at version 1.2.1, but then releases 1.2.2, which you ship as a distro right away, but then a few days later, they’ve already released 1.2.3, which you ship, too.
      Now Agnes comes home at the weekend and runs package updates on her system, which is still on libbanana v1.2.1. At that point, she would need the diffs 1.2.1→1.2.2 and then 1.2.2→1.2.3 separately, which may have overlaps in which files changed.

      In principle, you could additionally provide the diff 1.2.1→1.2.3, but if Greg updates only every other weekend, and libbanana celebrates the 1.3.0 release by then, then you will also need the diffs 1.2.1→1.3.0, 1.2.2→1.3.0 and 1.2.3→1.3.0. So, this strategy quickly explodes with the number of different diffs you might need.

      At that point, just not bothering with diffs and making users always download the new package version in full is generally preferred.
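      To make the explosion concrete, here’s a toy Python sketch (hypothetical version numbers, not any distro’s actual tooling): if a mirror wants to let any out-of-date client jump straight to the latest release in one download, it has to precompute a delta for every (older, newer) pair, which grows quadratically, while full packages stay at one file per release.

      ```python
      def deltas_needed(versions: list) -> list:
          """Every (old, new) pair a mirror must precompute so that any
          out-of-date client can jump straight to a later version."""
          pairs = []
          for i, old in enumerate(versions):
              for new in versions[i + 1:]:
                  pairs.append((old, new))
          return pairs

      releases = ["1.2.1", "1.2.2", "1.2.3", "1.3.0"]
      print(len(releases), "full packages, but", len(deltas_needed(releases)), "deltas")
      # 4 releases already need 6 deltas; n releases need n*(n-1)/2
      ```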

      • MangoPenguin@lemmy.blahaj.zone · 3 months ago

        Interesting, it wouldn’t work like rsync where it compares the new files to the old ones and transfers the parts that have changed?

        • Ephera@lemmy.ml · 3 months ago

          Hmm, good question. I know of one such implementation, Delta RPM, which works the way I described it.
          But I’m not sure if they just designed it that way to fit into the existing architecture, where all their mirrors and such were set up to deal with whole package files.

          I could imagine that doing it rsync-style would be really terrible for server load, since you can’t really cache things at that point…
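          For contrast, here’s a heavily simplified Python sketch of the rsync idea (fixed block boundaries and no rolling checksum, so it’s only an illustration): the client sends signatures of the file it already has, and the server computes a delta against those. Since every client may hold a different old version, that per-client computation can’t be precomputed or cached the way static package or delta files can.

          ```python
          import hashlib

          BLOCK = 4  # toy block size

          def signatures(old: bytes) -> dict:
              """Client side: checksum each block of the old file and send
              the map {checksum: block offset} to the server."""
              return {
                  hashlib.md5(old[i:i + BLOCK]).hexdigest(): i
                  for i in range(0, len(old), BLOCK)
              }

          def delta(new: bytes, sigs: dict) -> list:
              """Server side: scan the new file and emit either 'copy this
              block from your old file' or the literal new bytes. This work
              depends on each client's signatures, so it can't be cached."""
              ops = []
              for i in range(0, len(new), BLOCK):
                  chunk = new[i:i + BLOCK]
                  if hashlib.md5(chunk).hexdigest() in sigs:
                      ops.append(("copy", sigs[hashlib.md5(chunk).hexdigest()]))
                  else:
                      ops.append(("data", chunk))
              return ops

          old = b"AAAABBBBCCCC"
          new = b"AAAAXXXXCCCC"
          print(delta(new, signatures(old)))
          # [('copy', 0), ('data', b'XXXX'), ('copy', 8)]
          ```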

          • MangoPenguin@lemmy.blahaj.zone · 3 months ago

            Yeah, I guess these days the majority of users have fast enough connections that it’s not worth it. It sucks if you have crappy internet, though, hah.