There were a number of exciting announcements from Apple at WWDC 2024, from macOS Sequoia to Apple Intelligence. A subtler addition came to Xcode 16 — the development environment for Apple platforms like iOS and macOS — in the form of a feature called Predictive Code Completion. Unfortunately, if you bought into Apple’s claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won’t be able to use it. There’s a memory requirement for Predictive Code Completion in Xcode 16, and it’s the closest thing we’ll get from Apple to an admission that 8GB of memory isn’t really enough for a new Mac in 2024.

  • maxinstuff@lemmy.world · 3 months ago

    Oh man, I remember how many people have defended 8GB ever since the M1 first came out.

    I always argued it would significantly reduce the lifetime of these machines if you bought one, not just because you’d be swapping a lot more on the (soldered-in, BTW) SSD, but because after a few years of updates it would become unbearably slow, or hardware would fail, or both.

    Didn’t stop people constantly “tHe aRchITecTuRE iS cOmPlETelY diFFeRenT!!!”

    Sure it’s different, but it’s still just a computer. A technical person can still look at the spec sheet and calculate effective performance accounting for bus widths etc.
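
That spec-sheet math can be sketched roughly like this (the helper function and the DDR4 comparison point below are my own illustrative assumptions, not figures from the thread):

```python
# Peak theoretical bandwidth from spec-sheet numbers:
# bytes/s = (bus width in bits / 8) * transfers per second.

def memory_bandwidth_gb_s(bus_width_bits: int, transfer_rate_mt_s: int) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * transfer_rate_mt_s * 1e6 / 1e9

m1 = memory_bandwidth_gb_s(128, 4266)   # M1: 128-bit LPDDR4X-4266
pc = memory_bandwidth_gb_s(128, 3200)   # typical dual-channel DDR4-3200 desktop

print(f"M1: ~{m1:.1f} GB/s vs. DDR4-3200 desktop: ~{pc:.1f} GB/s")
```

Peak numbers only; real effective bandwidth depends on access patterns and the memory controller.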

    Disclosure: I bought a top spec 16GB M1 Mac Air on launch and have been extremely happy with it - it’s still going strong.

    • uis@lemm.ee · 3 months ago

      Didn’t stop people constantly “tHe aRchITecTuRE iS cOmPlETelY diFFeRenT!!!”

      Different Turing Machine on different math and alternative physics, I guess.

      I bought a top spec 16GB M1 Mac Air on launch

      My condolences.

      EDIT: do people genuinely believe that math doesn’t apply to Apple’s products, or do they just not understand even such concentrated sarcasm?

  • Hux@lemmy.ml · 3 months ago

    This isn’t a big deal.

    If you’re developing in Xcode, you did not buy an 8GB Mac in the last 10 years.

    If you are just using your Mac for Facebook and email, I don’t think you know what RAM is.

    If you know what RAM is, and you bought an 8GB Mac in the last 10 years, then you are likely aware of your limited demands and/or made an informed compromise.

  • Jtee@lemmy.world · 3 months ago

    And now all the fan boys and girls will go out and buy another MacBook. That’s planned obsolescence for ya

    • bamboo@lemm.ee · 3 months ago

      Someone who is buying a MacBook with the minimum specs probably isn’t the same person that’s going to run out and buy another one to get one specific feature in Xcode. Not trying to defend Apple here, but if you were a developer who would care about this, you probably would have paid for the upgrade when you bought it in the first place (or couldn’t afford it then or now).

      • TheGrandNagus@lemmy.world · 3 months ago

        Well no, not this specific scenario, because of course devs will generally buy machines with more RAM.

        But there are definitely people who will buy an 8GB Apple laptop, run into performance issues, then think “oh I must need to buy a new MacBook”.

        If Apple didn’t purposely manufacture ewaste-tier 8GB laptops, that would be minimised.

    • m-p{3}@lemmy.ca · 3 months ago

      And that’s why they solder the RAM, or even worse, make it part of the SoC.

      • rockSlayer@lemmy.world · 3 months ago

        There are real-world performance benefits to RAM being as close as possible to the CPU, so it’s not entirely without merit. But that’s what CAMM modules are for.

        • akilou@sh.itjust.works · 3 months ago

          But do those benefits outweigh doubling or tripling the amount of RAM by simply inserting another stick that you can buy for dozens of dollars?

          • BorgDrone@lemmy.one · 3 months ago

            Yes, there are massive advantages. It’s basically what makes unified memory possible on modern Macs. Especially with all the interest in AI nowadays, you really don’t want a machine with a discrete GPU/VRAM, a discrete NPU, etc.

            Take for example a modern high-end PC with an RTX 4090. Those only have 24GB VRAM and that VRAM is only accessible through the (relatively slow) PCIe bus. AI models can get really big, and 24GB can be too little for the bigger models. You can spec an M2 Ultra with 192GB RAM and almost all of it is accessible by the GPU directly. Even better, the GPU can access that without any need for copying data back and forth over the PCIe bus, so literally 0 overhead.

            The advantages of this multiply when you have more dedicated silicon. For example: if you have an NPU, that can use the same memory pool and access the same shared data as the CPU and GPU with no overhead. The M series also have dedicated video encoder/decoder hardware, which again can access the unified memory with zero overhead.

            For example: you could have an application that replaces the background on a video using AI. It takes a video and decompresses it using the video decoder; the decompressed frames are immediately available to all other components. The GPU can then pre-process the frames, the NPU can use the processed frames as input to an AI model to generate new frames, and the video encoder can immediately access the result and compress it into a new video file.

            The overhead of just copying data for such an operation on a system with non-unified memory would be huge. That’s why I think that the AI revolution is going to be one of the driving factors in killing systems with non-unified memory architectures, at least for end-user devices.
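
To put a rough number on that copy overhead (every constant below is an illustrative assumption — frame size, frame rate, link speed, and copy count — not a measurement):

```python
# Back-of-the-envelope cost of shuttling video frames between CPU and GPU
# memory over PCIe, vs. zero-copy access with unified memory.

FRAME_BYTES = 3840 * 2160 * 4   # one uncompressed 4K RGBA frame (~33 MB)
FPS = 60                        # target frame rate
PCIE_GB_S = 32                  # rough practical ceiling for PCIe 4.0 x16
COPIES_PER_FRAME = 4            # decoder->GPU, GPU->NPU, NPU->GPU, GPU->encoder

bytes_per_second = FRAME_BYTES * FPS * COPIES_PER_FRAME
fraction_of_bus = bytes_per_second / (PCIE_GB_S * 1e9)

print(f"copy traffic: {bytes_per_second / 1e9:.1f} GB/s, "
      f"about {fraction_of_bus:.0%} of the PCIe link; unified memory: none")
```

Even under these generous assumptions, a quarter of the link is spent just moving the same pixels back and forth.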

          • gravitas_deficiency@sh.itjust.works · 3 months ago

            It’s highly dependent on the application.

            For instance, I could absolutely see having certain models with LPCAMM expandability as a great move for Apple, particularly in the pro segment, so they’re not capped by whatever they can cram into their monolithic SoCs. But for most consumer (that is, non-engineer/non-developer users) applications, I don’t see them making it expandable.

            Or more succinctly: they should absolutely put LPCAMM in the next generation of MBPs, in my opinion.

    • Mongostein@lemmy.ca · 3 months ago

      And the apple haters will keep making this exact same comment on every post using their 3rd laptop in ten years while I’m still using my 2014 MacBook daily with no issues.

      Be more original.

      • Jtee@lemmy.world · 3 months ago

        Nice attempt to justify planned obsolescence. To think apple hasn’t done this time and time again, you’d have to be a fool

        • Mongostein@lemmy.ca · 3 months ago

          👍

          -posted from my ten year old MacBook which shows no need for replacement

          • Honytawk@lemmy.zip · 3 months ago

            At which point did Apple decide your MacBook was too old to be usable and stop giving updates or allow new software to run on it?

            • Mongostein@lemmy.ca · 3 months ago

              Still gets security updates. All the software I need to run on it runs on it.

              My email, desktop, and calendar all still sync with my newer desktop. I can still play StarCraft. I can join zoom meetings while running Roll 20. I can even run Premiere and do video editing… to a point.

              I guess if you need the latest and greatest then you might have a point, but I don’t.

              This whole thread is bitching about software bloat and Apple does that to stop the software bloat on older machines, but noooo that’s planned obsolescence. 🙄

  • resetbypeer@lemmy.world · 3 months ago

    Opens Chrome on an 8GB Mac. Sees lifespan of SSD being reduced by 50%. After 2-3 years of heavy usage the SSD starts to get errors. Apple’s solution: buy a new one. No wonder they are the 2nd/3rd wealthiest company on the planet.

  • egeres@lemmy.world · 3 months ago

    Why do they struggle so much with some “obvious things” sometimes? We wouldn’t have a USB-C iPhone if the EU hadn’t pressured them to make the switch.

    • helenslunch@feddit.nl · 3 months ago

      They don’t “struggle”. These are intentional and malicious decisions meant to drive revenue, as they have been since the beginning.

  • RecluseRamble@lemmy.dbzer0.com · 3 months ago

    I can’t believe there’s no Linux reference yet!

    Give your “8 gigs not enough” hardware to one of us and see it revived running faster than whatever you’re running now with your subpar OS.

  • kingthrillgore@lemmy.ml · 3 months ago

    They moved to on-die RAM for a reason: To nickel and dime yo ass.

    I needed to expense a Mac Mini for iOS development, and everyone (Me, the company, our purchasing department) was baffled at how much it cost to get 16 GB. And they only go up to 24GB. Imagine how much they’ll charge for 32 in a year!

    • stoly@lemmy.world · 3 months ago

      Mac Mini is meant to be sort of the starter desktop. For higher end uses, they want you on the Mac Studio, an iMac, or a Mac Pro.

      • Echo Dot@feddit.uk · 3 months ago

        Fair enough, but if their primary motivation was performance improvements, they wouldn’t stop at soldering 16 GB.

        If you’re going to weld shoes to your feet, you better at least make sure that they’re good shoes.

          • Echo Dot@feddit.uk · 3 months ago

            Yeah, but if you’re only putting 8 GB of RAM in, then you’re also going to be constantly querying the hard drive. So any performance gain you get from soldering is lost by going all the way to the hard drive every 3 microseconds.

            It’s only better performance on paper; in reality there’s no real benefit. If you can run an application entirely within the 8 GB of RAM, and assuming you’re not running anything else, then maybe you get better performance.
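
Roughly quantifying the RAM-versus-drive gap (both latency figures below are ballpark assumptions for illustration, not benchmarks):

```python
# Ballpark latency math behind "going all the way to the drive".

RAM_LATENCY_NS = 100    # ~100 ns for a DRAM access
SSD_LATENCY_US = 80     # ~80 us for a random 4K read on a fast NVMe SSD

# How many RAM accesses fit in the time of one swapped-in page read.
ratio = SSD_LATENCY_US * 1_000 / RAM_LATENCY_NS
print(f"a swapped-out page costs roughly {ratio:.0f}x a RAM hit")
```

A few hundred times slower per access is why no amount of soldered-RAM bandwidth saves a machine that is constantly swapping.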

            • sugar_in_your_tea@sh.itjust.works · 3 months ago

              And that’s the idea. Soldering memory is an engineering decision. How much to solder is a marketing decision. Since users can’t easily add more, marketing can upsell on more RAM.

              It’s not “on paper,” the RAM itself is performing better vs socketed RAM. Whether the system runs better depends on the configuration, as in, did you order enough RAM.

              • Echo Dot@feddit.uk · 3 months ago

                I can’t tell if you’re a stooge or if you really think that. I hope you are stooge, because otherwise that’s a really stupid position you’ve decided to take and you clearly don’t actually understand the issue.

                • sugar_in_your_tea@sh.itjust.works · 3 months ago

                  I’m pretty sure I do understand the issue. Here are some facts (and an article to back it up):

                  1. putting memory closer to the CPU improves performance through lower latency and higher bandwidth (e.g. from 96GB/s up to 200GB/s on the M1 Pro or 400GB/s on the M1 Max)
                  2. customers can’t easily solder on more RAM
                  3. Apple’s RAM upgrades are way more expensive than socketed options on the market

                  And here’s my interpretation/guesses:

                  1. marketing sees 1 & 2, and sees an opportunity to do more of 3
                  2. marketing probably asked engineering what the bare minimum is, and they probably said 8GB (assuming web browsing and whatnot only), though 16GB is preferable (that’s what I’d answer)
                  3. marketing sets the minimum @ 8GB, banking on most users who need more than the basics to buy more, or for users to buy another laptop sooner when they realize they ran out of RAM (getting after-sale RAM upgrades is expensive)

                  So:

                  • using soldered RAM is an engineering decision due to improved performance (double socketed RAM w/ Intel on M1, quadruple on M1 Max)
                  • limiting RAM to 8GB is a marketing decision
                  • if you don’t have enough RAM, that doesn’t mean the RAM isn’t performing well, it means you don’t have enough RAM

                  Using socketed RAM won’t fix performance issues related to running out of RAM, that issue is the same regardless. Only adding RAM will fix those performance issues, and Apple could just as easily make “special” RAM so you can’t buy socketed RAM on the regular market anyway (e.g. they’d need a different memory standard anyway due to Unified Memory).

                  I have hated Apple’s memory pricing for decades now, it has always been way more expensive to add RAM to an Apple device at order time vs PC competitors (I still add my own RAM to laptops, but it’s usually way cheaper through HP, Lenovo, etc than Apple at build-time). I’m not defending them here, I’m merely saying that the decision to use soldered RAM makes a lot of engineering sense, especially with the new Unified Memory architecture they’re using in the M-series devices.

  • seb@programming.dev · 3 months ago

    I have a MacBook Air M2 with 8GB of RAM and I can even run ollama. I’ve never had RAM problems. I don’t get all the hate.

    • sverit@lemmy.ml (OP) · 3 months ago

      Which model with how many parameters do you use in ollama? With 8GB you should only be able to use the smallest models, which is faaaar from ideal:

      You should have at least 8 GB of RAM available to run the 7B models, 16 GB to run the 13B models, and 32 GB to run the 33B models.
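
As a rough sanity check on those numbers, a common rule of thumb is parameters × bytes-per-parameter plus runtime overhead. The helper below is hypothetical (not part of ollama), and its constants are assumptions; real usage varies with quantization level and context length:

```python
# Rule-of-thumb estimate of resident RAM for a local LLM.
# 0.5 bytes/param approximates 4-bit quantization; overhead_gb is a guess
# covering the KV cache and runtime.

def approx_model_ram_gb(params_billions: float,
                        bytes_per_param: float = 0.5,
                        overhead_gb: float = 1.0) -> float:
    return params_billions * bytes_per_param + overhead_gb

for p in (7, 13, 33):
    print(f"{p}B model @ 4-bit: ~{approx_model_ram_gb(p):.1f} GB")
```

Even heavily quantized, a 33B model simply does not fit on an 8GB machine once the OS and apps take their share.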

      • seb@programming.dev · 3 months ago

        llama3:8b. I know it’s “far from ideal”, but only really specific use cases require more advanced models to run locally; if you do software development, graphic design, or video editing, 8GB is enough.

        edit: just tried it after some time and it works better than I remembered showcase

  • _number8_@lemmy.world · 3 months ago

    imagine showing this post to someone in 1995

    shit has gotten too bloated these days. i mean even in my head 8GB still sounds like ‘a lot’ of RAM and 16GB feels extravagant

    • mycodesucks@lemmy.world · 3 months ago

      Absolutely.

      Bad, rushed software that wires together 200 different giant libraries just to use a fraction of them, then runs in a sandboxed container with three daemons it needs for some reason, doesn’t mean “8GB isn’t enough”; it means: write tighter, better software.

    • rottingleaf@lemmy.zip · 3 months ago

      I still can’t fully accept that 1GB is not normal, 2GB is not very good, and 4GB is not all you’re ever gonna need.

      If only it got bloated for some good reasons.

      • Aux@lemmy.world · 3 months ago

        High quality content is the reason. Sit in a terminal and your memory usage will be low.

        • lastweakness@lemmy.world · 3 months ago

          So we’re just going to ignore stuff like Electron, unoptimized assets, etc… Basically every other known problem… Yeah let’s just ignore all that

          • Aux@lemmy.world · 3 months ago

            Is Electron that bad? Really? I have Slack open right now with two servers and it takes around 350MB of RAM. Not that bad, considering that every other colleague thinks that posting dumb shit GIFs into work chats is cool. That’s definitely nowhere close to Firefox, Chrome and WebStorm eating multiple gigs each.

            • lastweakness@lemmy.world · 3 months ago

              Yes, it really is that bad. 350MB of RAM for something that could otherwise have taken less than 100? That isn’t bad to you? And it’s not just RAM; it’s every resource, including CPU, which is especially bad with Electron.

              I don’t really mind Electron myself because I have enough resources. But pretending the lack of optimization isn’t a real problem is just not right.

              • Aux@lemmy.world · 3 months ago

                First of all, 350MB is a drop in a bucket. But what’s more important is performance, because it affects things like power consumption, carbon emissions, etc. I’d rather see Slack “eating” one gig of RAM and running smoothly on a single E core below boost clocks with pretty much zero CPU use. That’s the whole point of having fast memory - so you can cache and pre-render as much as possible and leave it rest statically in memory.

                • Verat@sh.itjust.works · 3 months ago

                  When (according to about:unloads) my average Firefox tab is 70-230MB, depending on what it is and how old the tab is (YouTube tabs, for example, bloat up the longer they’re open), a chat app using over 350MB is a pretty big deal.

                  Just checked: my Firefox is using 4.5GB of RAM, while Telegram is using 2.3GB while minimized to the system tray. Granted, Telegram doesn’t use Electron, but this is a trend across lots of programs, and Electron is a big enough offender that I avoid apps that use it. When I get off shift I can launch Discord and check it too, but it’s usually bad enough that I close it entirely when not in use.

                • jas0n@lemmy.world · 3 months ago

                  Just wanted to point out that the number 1 performance blocker in the CPU is memory. In the general case, if you’re wasting memory, you’re wasting CPU. These two things really cannot be talked about in isolation.

      • Honytawk@lemmy.zip · 3 months ago

        The moment you use a file that is bigger than 1GB, that computer will explode.

        Some of us do more than just browse Lemmy.

        • rottingleaf@lemmy.zip · 3 months ago

          Wow. Have you ever considered how people worked with files bigger than their total RAM back in the normal days of computing?

          So, in your opinion, if you have a 2GB+ log file, editing it should occupy 2GB of RAM?

          I just have no words. The ignorance.

    • 31337@sh.itjust.works · 3 months ago

      I wonder what the general use is for the Mac Mini, MacBook Air, iMac, and MacBook Pro? People generally seem to do all the lightweight stuff like social media consumption on their phones, and desktops/laptops are used for the heavier stuff. The only reason I’ve ever used a Mac was for iOS development.

      • vermyndax@lemmy.world · 3 months ago

        I have a friend who is self-employed. He uses an iPhone and a MacBook Air. He only uses iMessage, Numbers, Safari, and Apple Music for entertainment. He gets away with 8GB just fine and rarely has to reboot.

        He probably could use a Chromebook or something even lighter, but the support and ecosystem were enough for him to pay the premium. His time is valuable to him so it was worth it to him.