MemoryCache is an experimental developer project to turn a local desktop environment into an on-device AI agent.

  • fruitycoder@sh.itjust.works · 10 months ago

    An open-source project focused on giving people the features they want, but in a privacy-respecting and censorship-resistant way. Classic Moz.

    • Revv@lemmy.blahaj.zone · 10 months ago

      Seriously, what’s with all the Mozilla hate on Lemmy? People bitch about almost everything they do. Sometimes it feels like, because it’s non-profit/open-source, people have this idealized vision of a monastery full of impoverished, but zealous, single-minded monks working feverishly and never deviating from a very tiny mission.

      Cards on the table, I remain an AI skeptic, but I also recognize that it’s not going anywhere anytime soon. I vastly prefer to see folks like Mozilla branching out into the space a little than to have them ignore it entirely and cede the space to corporate interests/advertisers.

        • LWD@lemm.ee · 10 months ago

          Can you show some examples of where people complain about Mozilla taking Google money?

          Because when I complain about Mozilla, it’s because they fired their employees while bloating their CEO’s salary, and because Firefox languishes while they throw in privacy-invasive junk that nobody asked for.

      • fruitycoder@sh.itjust.works · 10 months ago

        That seems more aligned with their mission of fighting misinformation on the web. It looks like Fakespot was an acquisition, so hopefully efforts like the ones mentioned in this post help better align it with their other goals.

        • LWD@lemm.ee · 10 months ago

          That observation is… very tangential to my comment. I’m not sure anyone asked Mozilla Corp to start violating people’s privacy, and purchasing data sets, in order to allegedly fight misinformation (while showing ads in the same place, of course)…

          • fruitycoder@sh.itjust.works · 10 months ago

            What I’m saying is that Mozilla, from my understanding, didn’t set out to do that; instead it acquired a business that already did, in order to use its services to fight misinformation. We should pressure them to reform that new part of the business so it better aligns with the rest of Mozilla’s goals.

            • LWD@lemm.ee · 10 months ago

              I’ve been trying. No luck so far. The only change to the Fakespot TOS was adding an allowance for private data to get sold to Mozilla…

    • vrighter@discuss.tchncs.de · 10 months ago

      But it is not a feature I want. Not now, not ever. An inbuilt bullshit generator, now with less training and more bullshit, is not something I ever asked for.

      Training one of these AIs requires huge datacenters, insanely large datasets, and millions of dollars in resources. And I’m supposed to believe one will be effectively trained on the pittance of data generated by browsing?

      • fruitycoder@sh.itjust.works · 10 months ago (edited)

        Fine-tuning is much more feasible on end-user hardware. You also have projects like Hivemind and Petals that are working on distributed training and inference systems to address the concentration effects you described for base models.
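
        As a hedged sketch of what "fine-tuning on end-user hardware" can look like: parameter-efficient methods such as LoRA (here via Hugging Face's transformers and peft libraries) freeze the base model and train only a few million adapter weights, which fits on a single desktop GPU. The model name and hyperparameters below are illustrative assumptions, not anything from this thread or from MemoryCache itself.

        ```python
        # Hypothetical LoRA fine-tuning setup on consumer hardware (illustrative only).
        from transformers import AutoModelForCausalLM
        from peft import LoraConfig, get_peft_model

        base = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumption: any small open model
        model = AutoModelForCausalLM.from_pretrained(base)

        # LoRA keeps the base weights frozen and learns small low-rank adapters,
        # so only a tiny fraction of parameters needs gradients and optimizer state.
        lora = LoraConfig(
            r=8,
            lora_alpha=16,
            lora_dropout=0.05,
            target_modules=["q_proj", "v_proj"],  # attention projections in Llama-style models
            task_type="CAUSAL_LM",
        )
        model = get_peft_model(model, lora)
        model.print_trainable_parameters()  # typically well under 1% of the base model's weights
        ```

        Projects like Hivemind and Petals attack the remaining cost from the other direction, spreading training and inference across many volunteer machines instead of one datacenter.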