There were a number of exciting announcements from Apple at WWDC 2024, from macOS Sequoia to Apple Intelligence. However, a subtle addition to Xcode 16 — the development environment for Apple platforms, like iOS and macOS — is a feature called Predictive Code Completion. Unfortunately, if you bought into Apple’s claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won’t be able to use it. There’s a memory requirement for Predictive Code Completion in Xcode 16, and it’s the closest thing we’ll get from Apple to an admission that 8GB of memory isn’t really enough for a new Mac in 2024.

  • Jtee@lemmy.world · ↑126 ↓29 · 11 days ago

    And now all the fan boys and girls will go out and buy another MacBook. That’s planned obsolescence for ya

    • bamboo@lemm.ee · ↑52 ↓4 · 11 days ago

      Someone who is buying a MacBook with the minimum specs probably isn’t the same person that’s going to run out and buy another one to get one specific feature in Xcode. Not trying to defend Apple here, but if you were a developer who would care about this, you probably would have paid for the upgrade when you bought it in the first place (or couldn’t afford it then or now).

      • TheGrandNagus@lemmy.world · ↑20 ↓7 · 11 days ago

        Well no, not this specific scenario, because of course devs will generally buy machines with more RAM.

        But there are definitely people who will buy an 8GB Apple laptop, run into performance issues, then think “oh I must need to buy a new MacBook”.

        If Apple didn’t purposely manufacture ewaste-tier 8GB laptops, that would be minimised.

        • narc0tic_bird@lemm.ee · ↑7 ↓1 · edited · 10 days ago

          I wouldn’t be so sure. I feel like many people would not buy another MacBook if it were to feel a lot slower after just a few years.

          This feels like short term gains vs. long term reputation.

    • m-p{3}@lemmy.ca · ↑27 ↓7 · 11 days ago

      And that’s why they solder the RAM, or even worse, make it part of the SoC.

      • rockSlayer@lemmy.world · ↑48 ↓2 · 11 days ago

        There are real world performance benefits to ram being as close as possible to the CPU, so it’s not entirely without merit. But that’s what CAMM modules are for.

        • akilou@sh.itjust.works · ↑21 ↓2 · 11 days ago

          But do those benefits outweigh doubling or tripling the amount of RAM by simply inserting another stick that you can buy for dozens of dollars?

          • rockSlayer@lemmy.world · ↑18 ↓2 · 11 days ago

            That’s extremely dependent on the use case, but in my opinion, generally no. However CAMM has been released as an official JEDEC interface and does a good job at being a middle ground between repairability and speed.

            • halcyoncmdr@lemmy.world · ↑17 ↓2 · 11 days ago

              It’s an officially recognized spec, so Apple will ignore it as long as they can. Until they can find a way to make money from it or spin marketing as if it’s some miraculous new invention of theirs, for something that should just be how it’s done.

          • BorgDrone@lemmy.one · ↑3 ↓2 · 10 days ago

            Yes, there are massive advantages. It’s basically what makes unified memory possible on modern Macs. Especially with all the interest in AI nowadays, you really don’t want a machine with a discrete GPU/VRAM, a discrete NPU, etc.

            Take for example a modern high-end PC with an RTX 4090. Those only have 24GB VRAM and that VRAM is only accessible through the (relatively slow) PCIe bus. AI models can get really big, and 24GB can be too little for the bigger models. You can spec an M2 Ultra with 192GB RAM and almost all of it is accessible by the GPU directly. Even better, the GPU can access that without any need for copying data back and forth over the PCIe bus, so literally 0 overhead.

            The advantages of this multiply when you have more dedicated silicon. For example: if you have an NPU, that can use the same memory pool and access the same shared data as the CPU and GPU with no overhead. The M series also have dedicated video encoder/decoder hardware, which again can access the unified memory with zero overhead.

            For example: you could have an application that replaces the background on a video using AI. It takes a video and decompresses it using the video decoder; the decompressed frames are immediately available to all other components. The GPU can then pre-process the frames, the NPU can use the processed frames as input to an AI model and generate new frames, and the video encoder can immediately access the result and compress it into a new video file.

            The overhead of just copying data for such an operation on a system with non-unified memory would be huge. That’s why I think that the AI revolution is going to be one of the driving factors in killing systems with non-unified memory architectures, at least for end-user devices.
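
BorgDrone’s copy-overhead point can be put in rough numbers. A back-of-the-envelope sketch; the PCIe throughput figure and per-frame hand-off count are assumptions for illustration, not measurements:

```python
# Rough, illustrative numbers (assumptions, not benchmarks).
PCIE_BANDWIDTH_GBPS = 32          # assumed effective PCIe 4.0 x16 throughput, GB/s
model_size_gb = 24                # a model filling an RTX 4090's VRAM, per the comment

# Time to shuttle the whole model across the PCIe bus once:
copy_seconds = model_size_gb / PCIE_BANDWIDTH_GBPS
print(f"one full model copy over PCIe: ~{copy_seconds:.2f} s")   # ~0.75 s

# With unified memory the same hand-off is a pointer pass: effectively zero.
# In the video-pipeline example, each frame is handed between components
# (decoder -> GPU -> NPU -> encoder); assume 3 cross-bus copies per frame.
stages = 3
frames = 30 * 60                            # 1 minute of 30 fps video
frame_size_gb = 3840 * 2160 * 4 / 1e9       # one uncompressed 4K RGBA frame, ~33 MB
total_copy_s = stages * frames * frame_size_gb / PCIE_BANDWIDTH_GBPS
print(f"copy overhead for 1 min of 4K video: ~{total_copy_s:.1f} s")  # ~5.6 s
```

The absolute numbers are crude, but they show why per-stage copies add up while a unified pool makes the hand-off free.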

          • gravitas_deficiency@sh.itjust.works · ↑1 · 10 days ago

            It’s highly dependent on the application.

            For instance, I could absolutely see having certain models with LPCAMM expandability as a great move for Apple, particularly in the pro segment, so they’re not capped by whatever they can cram into their monolithic SoCs. But for most consumer (that is, non-engineer/non-developer users) applications, I don’t see them making it expandable.

            Or more succinctly: they should absolutely put LPCAMM in the next generation of MBPs, in my opinion.

        • TheGrandNagus@lemmy.world · ↑7 · 11 days ago

          Apple’s SoC long predates CAMM.

          Dell first showed off CAMM in 2022, and it only became JEDEC standardised in December 2023.

          That said, if Dell can create a really good memory standard and get JEDEC to make it an industry standard, so can Apple. They just chose not to.

      • Balder@lemmy.world · ↑7 ↓1 · edited · 10 days ago

        In this particular case the RAM is part of the chip as an attempt to squeeze out more performance. Processors have become so fast that the speed is wasted if the rest of the system can’t keep up. The traditional memory architecture has become a bottleneck, the same way HDDs were before the introduction of SSDs.

        You’ll see this same trend extend to Windows laptops as they shift to Snapdragon processors too.

        • stoly@lemmy.world · ↑3 · 10 days ago

          People do like to downplay this, but SoC is the future. There’s no way to get performance over a system bus anymore.

        • umami_wasabi@lemmy.ml · ↑9 ↓3 · 11 days ago

          Well, the claim they made still holds true, despite how much I dislike this design choice. It is faster, and more secure (though attacks on NAND chips are hard and require a skill level most attackers won’t possess).

          And add one more: it saves power by using LPDDR5 rather than DDR5. For a laptop, where battery life matters a lot, I agree that’s important. However, I have no idea how much standby or active time it gains by using LPDDR5.

    • Mongostein@lemmy.ca · ↑7 ↓2 · 10 days ago

      And the apple haters will keep making this exact same comment on every post using their 3rd laptop in ten years while I’m still using my 2014 MacBook daily with no issues.

      Be more original.

      • stoly@lemmy.world · ↑3 · 10 days ago

        This is pretty much it. People really just want to find reasons to hate Apple over the past 2-3 years. You’re right, though, your Mac can easily run for 10+ years. You’re good basically until web browsers no longer support your OS version, which is more in the 12-15 year range.

      • Jtee@lemmy.world · ↑1 · 10 days ago

        Nice attempt to justify planned obsolescence. To think Apple hasn’t done this time and time again, you’d have to be a fool.

    • Lucidlethargy@sh.itjust.works · ↑3 ↓1 · 10 days ago

      These were obsolete the minute they were made, though… So it’s not really planned obsolescence. I got one for free (MacBook Air), and it’s always been trash.

  • Hux@lemmy.ml · ↑92 ↓14 · 11 days ago

    This isn’t a big deal.

    If you’re developing in Xcode, you did not buy an 8GB Mac in the last 10 years.

    If you are just using your Mac for Facebook and email, I don’t think you know what RAM is.

    If you know what RAM is, and you bought an 8GB Mac in the last 10 years, then you are likely self-aware of your limited demands and/or made an informed compromise.

    • filister@lemmy.world · ↑39 ↓5 · edited · 11 days ago

      If you know what RAM is, and you bought an 8GB Mac in the last 10 years, then you are likely self-aware of your limited demands and/or made an informed compromise.

      Or you simply refuse to pay $200+ to get a proper machine. Seriously, 8GB Macs should have disappeared long ago, but no: Apple sticks with them as part of its planned-obsolescence tactics, stubbornly refusing to admit that releasing a MacBook with 8GB of soldered RAM in 2023 is wholly inadequate.

    • DJDarren@thelemmy.club · ↑16 ↓4 · 11 days ago

      I’m not gonna stand up and declare that 8GB is absolutely fine, because in very short order it won’t be. But yeah, currently for an average use case, it is.

      My work Mac mini has 8GB. It’s a 2014 so it can’t be upgraded, but for the tasks I ask of it it’s OK. Sure, it gets sluggish if I’m using the Win11 VM I sometimes need, but generally I don’t really have any issues doing regular office tasks.

      That said, I sometimes get a bee in my bonnet about it, so I open Activity Monitor to see what it’s doing, and am shocked by how much RAM some websites consume in open tabs in Safari.

      8GB is generally OK on low-end gear, but devs are working very hard to ensure that it’s not.

    • stoly@lemmy.world · ↑1 · 10 days ago

      Funny: knowing that you only get one shot, I bought 32GB of RAM for my Mac Mini like 1.5 years ago. I figured that it gave me the best shot of keeping it usable past 5 years.

  • _number8_@lemmy.world · ↑72 ↓6 · 11 days ago

    imagine showing this post to someone in 1995

    shit has gotten too bloated these days. i mean even in my head 8GB still sounds like ‘a lot’ of RAM and 16GB feels extravagant

    • rottingleaf@lemmy.zip · ↑21 · 10 days ago

      I still can’t fully accept that 1GB is not normal, 2GB is not very good, and 4GB is not all you’re ever gonna need.

      If only it got bloated for some good reasons.

      • Aux@lemmy.world · ↑4 ↓9 · 10 days ago

        High quality content is the reason. Sit in a terminal and your memory usage will be low.

        • rottingleaf@lemmy.zip · ↑7 ↓3 · 10 days ago

          256MB or 512MB was fine for high-quality content in 2002, so what was that, then?

          Suppose the amount of pixels and everything quadrupled - OK, then 2GB it is.

          But 4GB being not enough? Do you realize what 4GB is?

          • lastweakness@lemmy.world · ↑8 · 10 days ago

            They didn’t just quadruple. They’re orders of magnitude higher these days. So content is a real thing.

            But that’s not what’s actually being discussed here, memory usage these days is much more of a problem caused by bad practices rather than just content.

            • rottingleaf@lemmy.zip · ↑1 ↓2 · 10 days ago

              I know. BTW, if something is done in a way that’s an order of magnitude less efficient than it could be, one might consider that the result of an intentional policy aimed at neutering development. Just not clear whose. There are fewer corporations affecting this than big governments, and those are capable of reaching consensus from time to time. So not a conspiracy theory.

          • Aux@lemmy.world · ↑4 ↓3 · 10 days ago

            One frame for a 4K monitor takes 33MB of memory. You need three of them for the triple buffering used back in 2002, so nearly 100MB of your 256MB went to simply displaying a bloody UI. But there’s more! Today we’re using viewport composition, so the more apps you run, the more memory you need just to display the UI. That’s what the OS uses to render the final result, but your app will use additional memory for high-res icons, fonts, photos, videos, etc. 4GB today is nothing.
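
The framebuffer arithmetic above is easy to check. A quick sketch, assuming 4 bytes per pixel as the comment does:

```python
# One 4K frame at 4 bytes/pixel (e.g. RGBA or XRGB):
frame_bytes = 3840 * 2160 * 4
print(f"one 4K frame: {frame_bytes / 2**20:.1f} MiB")   # 31.6 MiB, i.e. ~33 MB

# Triple buffering, as used for tear-free rendering:
triple = 3 * frame_bytes
print(f"triple-buffered: {triple / 2**20:.0f} MiB")     # 95 MiB of a 256 MiB system

# With viewport composition every visible window keeps its own surface,
# so the total scales with the number and size of open windows.
```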

            I can tell you an anecdote. My partner was making a set of photo collages, about 7 art works to be printed in large format (think 5m+ per side). So 7 photo collages with source material saved on an external drive took 500 gigs. Tell me more about 256MB, lol.

            • rottingleaf@lemmy.zip · ↑2 ↓3 · edited · 10 days ago

              Yes, you wouldn’t have 4K in 2002.

              4GB today is nothing.

              My normal usage would be kinda strained with it, but possible.

              $ free -h
                             total        used        free      shared  buff/cache   available
              Mem:            17Gi       3,1Gi        11Gi       322Mi       3,0Gi        14Gi
              Swap:          2,0Gi          0B       2,0Gi
              $ 
              
        • lastweakness@lemmy.world · ↑5 ↓2 · 10 days ago

          So we’re just going to ignore stuff like Electron, unoptimized assets, etc… Basically every other known problem… Yeah let’s just ignore all that

          • Aux@lemmy.world · ↑2 ↓6 · 10 days ago

            Is Electron that bad? Really? I have Slack open right now with two servers and it takes around 350MB of RAM. Not that bad, considering that every other colleague thinks that posting dumb shit GIFs into work chats is cool. That’s definitely nowhere close to Firefox, Chrome and WebStorm eating multiple gigs each.

            • lastweakness@lemmy.world · ↑7 ↓1 · 10 days ago

              Yes, it really is that bad. 350 MBs of RAM for something that could otherwise have taken less than 100? That isn’t bad to you? And also, it’s not just RAM. It’s every resource, including CPU, which is especially bad with Electron.

              I don’t really mind Electron myself because I have enough resources. But pretending the lack of optimization isn’t a real problem is just not right.

              • Aux@lemmy.world · ↑1 ↓5 · 10 days ago

                First of all, 350MB is a drop in a bucket. But what’s more important is performance, because it affects things like power consumption, carbon emissions, etc. I’d rather see Slack “eating” one gig of RAM and running smoothly on a single E core below boost clocks with pretty much zero CPU use. That’s the whole point of having fast memory - so you can cache and pre-render as much as possible and leave it rest statically in memory.

                • lastweakness@lemmy.world · ↑4 · 10 days ago

                  CPU usage is famously terrible with Electron, which I also pointed out in the comment you’re replying to. But yes, having multiple Chromium instances running, one for each “app”, is terrible.

                • Verat@sh.itjust.works · ↑1 · edited · 10 days ago

                  When (according to about:unloads) my average Firefox tab is 70-230MB depending on what it is and how old the tab is (YouTube tabs, for example, bloat up the longer they’re open), a chat app using over 350 is a pretty big deal.

                  Just checked: my Firefox is using 4.5GB of RAM, while Telegram is using 2.3GB while minimized to the system tray. Granted, Telegram doesn’t use Electron, but this is a trend across lots of programs, and Electron is a big enough offender that I avoid apps built on it. When I get off shift I can launch Discord and check it too, but it’s usually bad enough that I close it entirely when not in use.

    • yeehaw@lemmy.ca · ↑17 ↓3 · 10 days ago

      I chalk it up to lazy rushed development. Good code is art.

      • Aux@lemmy.world · ↑4 ↓5 · 10 days ago

        That’s not true at all. The code doesn’t take much space. The content does. Your high quality high res photos, 4K HDR videos, lossless 96kHz audio, etc.

        • yeehaw@lemmy.ca · ↑1 · edited · 10 days ago

          But there are lots of shortcuts now. Asset packs and coding environments that come bundled with all kinds of things you don’t need. People import packages that consume a lot of space to use one tiny piece of it.

          To be clear, I’m not talking about videos and images. You’d have these either way.

          • Aux@lemmy.world · ↑1 · 10 days ago

            All these packages don’t take much memory. Also, tree shaking is a thing. For example, one of the projects I currently work on has over 5 gigs of dependencies, but once I compile it for production, the whole code base is a mere 3 megs, and that’s including inlined styles and icons. The code itself is pretty much non-existent.

            On the other hand I have 100KB of text translations just for the English language alone. Because there’s shit loads of text. And over 100MB of images, which are part of the build. And then there’s a remote storage with gigabytes of documents.

            Even if I double the code base by copy pasting it will be a drop in a bucket.

    • Bjornir@programming.dev · ↑13 · 10 days ago

      I have a VPS that uses 1GB of RAM; it has 6-7 apps running in Docker containers, which isn’t the most RAM-efficient way of running apps.

      A light OS really helps. Plus, the most-used app that eats a lot of RAM, the web browser, actually reduces its consumption when memory is tight and uses more when memory is free. On one computer I have Chrome running with some hundreds of MB used, instead of the usual GBs, because RAM is running out.

      So it can appear that memory is full when you actually have a bit more available memory that is “hidden”.

      • derpgon@programming.dev · ↑8 · 10 days ago

        Same here. When idle, the apps basically consume nothing. If they are just a webserver that calls to some PHP script, it basically takes no RAM at all when idle, and some RAM when actually used.

        Websites and phone apps are such unoptimized pieces of garbage that they are the sole reason for high RAM requirements. Also lots of background bloatware.

      • Specal@lemmy.world · ↑6 ↓1 · 10 days ago

        This is resource reservation, and it happens at the OS level. If Chrome appears to be using a lot of RAM, that memory will be freed up once either the OS or another application requires it.

        It just exists so that an application knows that if it needs that resource, it can use X amount for now.

    • Aux@lemmy.world · ↑6 ↓4 · 10 days ago

      You can always switch to a text-based terminal and free up your memory. Just don’t complain that YouTube doesn’t play 4K videos anymore.

    • Shadywack@lemmy.world · ↑1 · 10 days ago

      We measure success by how many GBs we have consumed when the only keys pressed from power-on to desktop are our password. This shit right here is the real issue.

    • stoly@lemmy.world · ↑1 · 10 days ago

      You just have to watch your favorite tablet get slower year after year to understand that a lot of this is artificial. They could make applications that don’t need those resources but would never do so.

  • resetbypeer@lemmy.world · ↑42 ↓2 · 10 days ago

    Opens Chrome on an 8GB Mac. Sees lifespan of the SSD being reduced by 50%. After 2-3 years of heavy usage the SSD starts to get errors. Apple’s solution: buy a new one. No wonder they are the 2nd/3rd wealthiest company on the planet.

  • SpeedLimit55@lemmy.world · ↑45 ↓6 · 11 days ago

    8GB is definitely not enough for coding, gaming, or most creative work, but it’s fine for basic office/school work or entertainment. Heck, my M1 MacBook Air is even good with basic Photoshop/Illustrator work and light AV editing. I certainly prefer my PC laptop with 32GB and a dedicated GPU, but its power adapter weighs more than a MacBook Air.

    • cmnybo@discuss.tchncs.de · ↑35 ↓2 · 11 days ago

      8GB would be fine for basic use if it was upgradable. With soldered RAM the laptop becomes e-waste when 8GB is no longer enough.

      • slaacaa@lemmy.world · ↑13 ↓2 · edited · 11 days ago

        Yeah, the soldering is outrageous. I miss the time when Apple was a (more) customer friendly company. I could open my Mac mini 2009 and just add more RAM, which I did.

        • DJDarren@thelemmy.club · ↑10 · 11 days ago

          When I bought my first MacBook in ‘07 I asked the guy in the store about upgrading the RAM. He told me that what Apple charged was outrageous and pointed me to a website where I’d get what I needed for much less.

          I feel that if Apple could have soldered the RAM back then, they would have.

          • boonhet@lemm.ee · ↑1 · 10 days ago

            I feel that if Apple could have soldered the RAM back then, they would have.

            Apple used to ship repair and upgrade kits with guides on how to apply them. Not sure they were as anti-repair then as they are now.

    • Specal@lemmy.world · ↑12 ↓1 · 10 days ago

      I mean I develop software on an 8GB laptop. Most of the time it’s fine, when I need more I have a desktop with 128GB ram available.

      Really depends what type of software you’re making. If you’re using Python, a few TB might be required.

    • cheddar@programming.dev · ↑11 ↓2 · 10 days ago

      8GB is definitely not enough for coding, gaming, or most creative work but it’s fine for basic office/school work or entertainment.

      The thing is, basic office/school/work tasks can be done on any laptop that costs half as much as an 8GB MacBook.

      • SpeedLimit55@lemmy.world · ↑3 ↓3 · edited · 10 days ago

        This is true for part-time or casual use, but for all-day work use, including travel, you get better build quality and far fewer problems with a pro-grade machine. We spend the same on a MacBook, ThinkPad, Surface or ProBook for our basic full-time users.

        While it may be a bit overkill for someone who spends their day in Word, Excel, Chrome and Zoom, we save money in the long term due to reliability. There is far less downtime and IT time spent on each user over the life of the system (3-4 years). The same is true of higher-quality computer accessories.

    • woelkchen@lemmy.world · ↑30 · 10 days ago

      Shipping with Windows S. That’s Microsoft’s version of a Chromebook, meant for some light web browsing, for 188 dollars. I wouldn’t buy it, but this doesn’t look like a rip-off at this price point.

        • n0clue@lemmy.world · ↑9 · 10 days ago

          And if they raised the price to $250, they could go with a faster processor and better wifi!

      • purplemonkeymad@programming.dev · ↑6 · 10 days ago

        S mode does allow you to turn it off, so it’s more like a hobbled version of home.

        The computer is as bad as one I saw several years ago with 64GB eMMC and a “Quad core processor” (not a spec I shortened; that was literally the name shown in System). It did have 4 cores: at 400MHz, boosting to 1.1GHz. The buyer changed their mind and we couldn’t give it away.

        • woelkchen@lemmy.world · ↑9 ↓2 · 10 days ago

          Of course that notebook is bad but for the price point of shitty hardware, you get shitty hardware. Apple sells shitty hardware at the cost of premium hardware.

    • homura1650@lemm.ee · ↑5 · 10 days ago

      At a $188 price point, an additional 4GB of memory would probably add ~$10 to the cost, which is over a 5% increase. However, that is not the only component they cheaped out on. The linked unit also only has 64GB of storage, which they should probably increase to have a usable system …

      And soon you find that you’ve just reinvented a mid-market device instead of the low-market device you were trying to sell.

      4GB of RAM is still plenty to have a functioning computer. It will not be as capable as a more powerful computer, but that comes with the territory of buying the low-cost version of a product.
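
The margin math in this comment checks out; a one-line sketch using the commenter’s own estimate (the ~$10 RAM cost is their assumption, not a verified BOM figure):

```python
# Numbers from the comment above; the $10 is the commenter's estimate.
unit_price = 188          # retail price of the low-end laptop, USD
extra_ram_cost = 10       # assumed added cost of 4GB more RAM, USD

increase = extra_ram_cost / unit_price
print(f"price impact: {increase:.1%}")   # 5.3%, i.e. "over a 5% increase"
```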

      • PraiseTheSoup@lemm.ee · ↑2 · 10 days ago

        I don’t even understand how HP still exists. Can anyone name a single product they’ve made in the last ~15 years that wasn’t a complete piece of junk?

        • RunawayFixer@lemmy.world · ↑1 · 10 days ago

          I really like their pagewide xl printers, but those are purely aimed at businesses. Just to name one thing I like :D

          And those XL printers are the only thing I can think of. I won’t even consider buying a current HP computer/laptop/small printer/…

  • small44@lemmy.world · ↑31 ↓15 · edited · 11 days ago

    For who? My mother, who only uses Facebook, YouTube and Google, doesn’t need 8GB.

          • cybersandwich@lemmy.world · ↑18 · 11 days ago

            This comment chain made me chuckle. It’s such an “internet comment section” …trope? I don’t know the right word.

              • Balder@lemmy.world · ↑1 · edited · 10 days ago

                The whole thread is just nonsense, though. People went from “Apple restricts local LLM inference in Xcode development to 16GB” to “Apple admits 8GB is useless”, and a lot of people who never researched the problems of the traditional memory architecture are criticizing the unified memory model without understanding it.

      • disguy_ovahea@lemmy.world
        link
        fedilink
        English
        arrow-up
        4
        arrow-down
        2
        ·
        edit-2
        11 days ago

        That all depends on how much work they want to put into troubleshooting it for her. I got my mom a Mac Mini when her PC needed to be replaced. It’s way less responsibility on my part. I mostly just answer the occasional how-to.

        • Glowstick@lemmy.world
          link
          fedilink
          English
          arrow-up
          4
          arrow-down
          2
          ·
          10 days ago

          Mac is easier than Windows, sure, but not easier than a chromebook. Nothing is simpler than a Chromebook. You can do much more with a Mac, but a chromebook is much easier.

    • ABCDE@lemmy.world
      link
      fedilink
      English
      arrow-up
      12
      arrow-down
      2
      ·
      11 days ago

      I don’t know what Xcode is so yeah, I haven’t been found wanting with my 8GB M2. Videos, downloading, web browsing, writing, chat applications, some photo editing, games (what I can actually play on a Mac, anyway), all good here.

      16GB+ is obviously going to be necessary though, and not exactly that expensive to put into their base models so it should be put in soon.

    • iopq@lemmy.world
      link
      fedilink
      English
      arrow-up
      8
      ·
      11 days ago

      I had a laptop with 8GB. Doing one of those things at a time was fine, but when I opened up another program, it took forever to switch back to the browser

      • Petter1@lemm.ee
        link
        fedilink
        English
        arrow-up
        1
        ·
        10 days ago

        And then you have to activate Linux app support for a thing she needs and can’t do with a Chromebook, and suddenly it’s more complicated than macOS?

  • kingthrillgore@lemmy.ml
    link
    fedilink
    English
    arrow-up
    12
    arrow-down
    1
    ·
    10 days ago

    They moved to on-die RAM for a reason: To nickel and dime yo ass.

    I needed to expense a Mac Mini for iOS development, and everyone (Me, the company, our purchasing department) was baffled at how much it cost to get 16 GB. And they only go up to 24GB. Imagine how much they’ll charge for 32 in a year!

    • stoly@lemmy.world
      link
      fedilink
      English
      arrow-up
      3
      ·
      10 days ago

      Mac Mini is meant to be sort of the starter desktop. For higher end uses, they want you on the Mac Studio, an iMac, or a Mac Pro.

  • RecluseRamble@lemmy.dbzer0.com
    link
    fedilink
    English
    arrow-up
    24
    arrow-down
    15
    ·
    10 days ago

    I can’t believe there’s no Linux reference yet!

    Give your “8 gigs not enough” hardware to one of us and see it revived, running faster than whatever you’re running now with your subpar OS.

    • mightyfoolish@lemmy.world
      link
      fedilink
      English
      arrow-up
      9
      arrow-down
      3
      ·
      10 days ago

      Software and AI development would be hard with 8GB of RAM on Linux. Have you seen the memes on AI adding to global climate change? Not even Linux can fix the issues with ChatGPT…

      • prole@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        10
        arrow-down
        1
        ·
        edit-2
        10 days ago

        I don’t think anyone anywhere is claiming 8GB RAM is enough for software and AI development. Pretty sure we’re talking about consumer-grade hardware here. And low-end at that.

        • monnier@lemmy.ca
          link
          fedilink
          English
          arrow-up
          5
          ·
          10 days ago

          My main development machine has 8 GB, for what it’s worth. And most of the software in use nowadays was developed when 8GB was a lot of RAM

        • Kazumara@discuss.tchncs.de
          link
          fedilink
          English
          arrow-up
          1
          ·
          edit-2
          10 days ago

          The lede by OP here contains this:

          […] addition to Xcode 16 […] is a feature called Predictive Code Completion. Unfortunately, if you bought into Apple’s claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won’t be able to use it

          So either RecluseRamble meant that development with a feature like predictive code completion would work on 8 GB of RAM if you were using Linux or his comparison was shit.

          • RecluseRamble@lemmy.dbzer0.com
            link
            fedilink
            English
            arrow-up
            2
            arrow-down
            1
            ·
            10 days ago

            That’s absolutely what I’m saying. Apple is just holding back that feature for upselling (as always) and because it’s hardly possible to debloat macOS.

            • Kazumara@discuss.tchncs.de
              link
              fedilink
              English
              arrow-up
              1
              ·
              10 days ago

              Okay good, thanks for confirming. I remember Kate feeling very nice to use during my studies, more responsive than VS Code or Eclipse. But I also had 16Gigabytes of RAM, so I couldn’t be sure.

    • RedWeasel@lemmy.world
      link
      fedilink
      English
      arrow-up
      3
      ·
      10 days ago

      I actually bought an M1 Mini as a low-power Linux server. I was getting tired of the Pi 4 being so slow when I needed to compile something. Works real well, just need the Asahi team to get Thunderbolt working. And for my server stuff, 8GB is plenty.

      • 🦄🦄🦄@feddit.de
        link
        fedilink
        English
        arrow-up
        1
        ·
        10 days ago

        You wouldn’t happen to run a Jellyfin server on that Mac Mini, would you? Currently looking for something performant with a small form factor and low power consumption.

        • Telodzrum@lemmy.world
          link
          fedilink
          English
          arrow-up
          8
          ·
          10 days ago

          I’ve run Plex servers on Mac Minis (M1). Docker on MacOS runs well finally — the issues that were everywhere a couple of years ago are resolved.

          It ran very well on the hardware. The OP of this post is right, 8gb is not enough in 2024; however I would also wager that the vast majority of commenters have not used MacOS recently or regularly. It is actually very performant and has a memory scheduler that rivals that found on GNU/Linux. Apple’s users aren’t wrong when they talk about how much better the OS is than Windows at using memory.

        • RedWeasel@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          ·
          edit-2
          10 days ago

          No, I do not, but I don’t see any reason it wouldn’t work. I currently have Pi-hole, Apache, email, CUPS, MythTV, and Samba.

    • el_abuelo@lemmy.ml
      link
      fedilink
      English
      arrow-up
      8
      arrow-down
      6
      ·
      10 days ago

      I’d love to see you run Xcode 16 code completion on your superior OS. Send me a link once you’ve uploaded the vid.

      • Mojave@lemmy.world
        link
        fedilink
        English
        arrow-up
        8
        arrow-down
        2
        ·
        10 days ago

        Why limit it to proprietary software? Almost every Linux distro can run GitHub Copilot X and JetBrains tools, which have both had more time to be publicly used and tested, and work better in my opinion.

        Send me a video link of a Mac having direct access to containers without using a VM (which ruins the point of containers). THAT is directly related to my actual work, as opposed to needing a robot to code for me, specifically using Apple’s AI

        • el_abuelo@lemmy.ml
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          2
          ·
          10 days ago

          Because that was what the article was about… I actually am a Linux user and fan; folks are just misreading the intentions of my post.

          I would genuinely love to see it, because I’m stuck on Mac hardware to do my job, and I really hope one day they get crucified for their anticompetitive practices so I can freely choose the OS my business uses.

      • RedWeasel@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        1
        ·
        10 days ago

        There is a project being worked on called Darling, but it isn’t ready yet. The developers are making progress though.

      • RecluseRamble@lemmy.dbzer0.com
        link
        fedilink
        English
        arrow-up
        1
        ·
        edit-2
        10 days ago

        As I said: feel free to upgrade your MacBook, just don’t throw away the one with a “meager” 8 gigs, since it’s totally usable with a non-bloated system.

  • poorlytunedAstring@lemmy.world
    link
    fedilink
    English
    arrow-up
    10
    arrow-down
    9
    ·
    11 days ago

    For the record, on Windows 10, I’m using 9GB (rounded up from 8.something) to run Firefox and look at this website, plus Discord inviting itself to my party in the background, and the OS. I had to close tabs to get down here. Streams really eat up the RAM.

    Throw a game in there, with FF open for advice and Discord running for all the usual gaming reasons, and yeah, way over.

    Notice I haven’t even touched any productivity stuff that demands more.

    8? Eat a penis, Apple. Fuckin clown hardware.

    • howlingecko@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      18
      arrow-down
      6
      ·
      11 days ago

      Also for the record, I have watched an 8GB Mac Mini run Firefox with at least 20 tabs, JetBrains Rider with code open and editable, JetBrains DataGrip with queries, somehow Microsoft Teams, and MS Outlook, and it didn’t seem to have a problem. It was also able to share the screen on a Teams call and switch between the applications without lag.

      Windows OS couldn’t handle your application load? Eat a penis, Microsoft. Fucking clown memory management.

    • Telodzrum@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      arrow-down
      1
      ·
      10 days ago

      macOS’s memory scheduler is leaps and bounds better than what Windows uses. It’s more apt to compare the RAM on a machine running macOS to one running a common Linux distro. Windows needs two to three times more RAM than the other two because it’s fuckterrible at using it.

  • vermyndax@lemmy.world
    link
    fedilink
    English
    arrow-up
    8
    arrow-down
    16
    ·
    10 days ago

    Apple has said that 8GB was enough for “general use,” meaning that if you use the out-of-the-box applications (Safari, Pages, Numbers, Keynote, etc.), then 8GB is enough to get basic things done. Apple is not going to say how much RAM is required for a third-party application to run. That would be impossible (especially for Chrome).

    This article says that the limitation occurs when running Xcode 16 with code completion. That is outside the definition of general use. Most people who are buying 8GB Macs are not going to be running Xcode at all.

    The article and most of these comments are way, way outside the realm of common sense and simply looking for a reason to attack Apple.

    If you don’t want to buy Apple, don’t buy it… and in the process, shut the fuck up.

    • stoly@lemmy.world
      link
      fedilink
      English
      arrow-up
      3
      arrow-down
      2
      ·
      10 days ago

      Sorry, boo, everyone wants to hate Apple these days. It’s the Zeitgeist. Even if you say something reasonable or perhaps factual, the people are against you and will react violently.

    • Shadywack@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      ·
      edit-2
      10 days ago

      If you don’t want to buy Apple, don’t buy it…

      Okay, we won’t!

      and in the process, shut the fuck up.

      Kiss my ass and go fuck yourself. If your opinions are so fragile that you can’t handle people rightly pointing this out, sounds like you have the problem, not everyone else Mr Skinner.

    • MiDaBa@lemmy.ml
      link
      fedilink
      English
      arrow-up
      11
      arrow-down
      14
      ·
      10 days ago

      If you choose to be a weak little quiet corporate stan, then that’s up to you. Apple is well aware that third-party apps exist, and they’re well aware that machines with less RAM will need to be replaced far sooner than machines with more. RAM is cheap, and Apple’s integrated memory is no different in that regard. The only reason to use less is planned obsolescence. If you don’t believe that, then you’re either Tim Cook or you’re an idiot.

      • vermyndax@lemmy.world
        link
        fedilink
        English
        arrow-up
        6
        arrow-down
        10
        ·
        10 days ago

        What is the obsession with shitting on people’s choices? I don’t understand the irony of demanding choice in this industry, then shitting on people when they make a choice you don’t agree with.