• jasep@lemmy.world (↑15 ↓1) · 5 months ago

      Baked in AI makes C Suite and shareholders happy. That’s about it.

    • fmstrat@lemmy.nowsci.com (↑1) · 5 months ago

      The implementation doesn’t sound terrible.

      • It’s opt-in
      • It’s basically a sidebar chat window

      So if you already use GPT for day-to-day, it may be a welcome experience. If you don’t, don’t opt in.

      I’m skeptical of GPT add-ons, but at least this was done in a low-bloat, opt-in way (which probably also brings Mozilla some revenue).

  • GolfNovemberUniform@lemmy.ml (↑18 ↓1) · edited · 5 months ago

    Looks like the “local AI only” idea was purged in favor of some Big Tech stuff that can give Mozilla some fat cash for promoting those services! Mozilla’s second (or third, idk at this point lol) downfall is looking really strong with all their recent decisions. WebKit is another independent engine that still doesn’t seem to have succumbed to enshittification, but it’s basically not used anywhere outside the Apple ecosystem. Chromium is getting a full monopoly, yay.

    • daniskarma@lemmy.dbzer0.com (↑8 ↓1) · 5 months ago

      I self-host several AI applications on a low-end device, and I think local AI is unfeasible for most low-end and even mid-range devices. It’s too resource-heavy these days, and response times are too long without high-end hardware.

      On my computer, generating a description of a picture (one of the new Firefox features) could easily take 5–10 minutes with the CPU at 100%. That’s just not viable while browsing.

      Anyway, I’d love for Firefox to open-source the server side of this, so anyone with a computer powerful enough could run it locally if they want to.
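
      The resource point can be illustrated with a rough back-of-the-envelope calculation (a sketch, not a benchmark: the model size and quantization levels below are assumptions for illustration, not Firefox’s actual setup):

```python
# Rough back-of-the-envelope check for whether a model's weights even fit in RAM.
# (Illustrative only: ignores activations, KV cache, and runtime overhead.)

def model_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    """GiB of RAM needed just to hold the model weights."""
    return params_billions * 1e9 * bytes_per_param / (1024 ** 3)

if __name__ == "__main__":
    # A hypothetical 7B-parameter model in fp16 (2 bytes/param) needs ~13 GiB
    # for weights alone -- already more than an 8 GB laptop has in total.
    print(f"7B @ fp16:  {model_memory_gb(7, 2.0):.1f} GiB")
    # Even aggressively quantized to 4-bit (0.5 bytes/param) it's ~3.3 GiB,
    # leaving little headroom on a low-end machine.
    print(f"7B @ 4-bit: {model_memory_gb(7, 0.5):.1f} GiB")
```

      Weights alone don’t capture the whole cost, but they already show why low-end hardware struggles before inference speed is even considered.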

    • Captain Janeway@lemmy.world (↑5) · 5 months ago

      Well, I’m guessing they actually tested local AI on 4 GB and 8 GB RAM laptops and realized it would be an awful user experience. It’s just too slow.

      I wish they rolled it in as an option though.

  • theshatterstone54@feddit.uk (↑12) · 5 months ago

    So it isn’t even local, private AI, but rather just an interface for NOT-private LLMs like ChatGPT (which specifically stated, at least at first, that all your queries and its responses are monitored and saved by OpenAI).