- cross-posted to:
- [email protected]
What purpose is served by having AI built-in to the browser?
Baked-in AI makes the C-suite and shareholders happy. That’s about it.
The implementation doesn’t sound terrible.
- It’s opt-in
- It’s basically a sidebar chat window
So if you already use GPT for day-to-day, it may be a welcome experience. If you don’t, don’t opt in.
I’m skeptical of GPT add-ons, but at least this was done in a low-bloat, opt-in way (which probably lets Mozilla bring in some revenue).
“Move all my rf research tabs to a new window”
Looks like the “local AI only” idea was purged in favor of some Big Tech stuff that can give Mozilla some fat cash for promoting their services! Mozilla’s second (or third, idk at this point lol) downfall is looking really likely with all their recent decisions. WebKit is another independent engine that still doesn’t seem to suck in terms of enshittification, but it’s basically not used anywhere outside the Apple ecosystem. Chromium is getting a full monopoly, yay.
I do self-host several AI applications for myself on a low-end device, and I think local AI is unfeasible for most low-end and even mid-range devices. Nowadays it’s too resource-heavy, and response times are too long without high-end hardware.
On my computer, generating a description of a picture (one of the new Firefox features) could easily take 5-10 minutes with the CPU at 100%. That’s just not viable while browsing.
Anyway, I would love for Firefox to open source the server side of this, so anyone with a powerful enough computer could run it locally if they want to.
Still, adding proprietary and actually evil AI providers is a questionable decision.
Well, I’m guessing they actually tested local AI on 4 GB and 8 GB RAM laptops and realized it would be an awful user experience. It’s just too slow.
I wish they’d rolled it in as an option, though.
They wanted to use fast small language models, not LLMs like Llama.
Ah yes, the update nobody actually wants…
So it isn’t even local, private AI, but rather just an interface for NOT-private LLMs like ChatGPT (which, at least at first, specifically stated that all your queries and its responses are monitored and saved by OpenAI).
Why are they not using their own llamafile?
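For context: llamafile packages a model as a single executable that serves an OpenAI-compatible API on localhost, so in principle the same sidebar client code could point at a local model instead of a cloud provider just by swapping the base URL. A minimal sketch of what that client side could look like (the port is llamafile’s default, and the model name is a placeholder, since llamafile serves whatever model is baked in; this is not anything Firefox actually ships):

```python
import json

# Assumption: a llamafile server running locally on its default port 8080,
# which exposes an OpenAI-compatible /v1/chat/completions endpoint.
LOCAL_BASE_URL = "http://localhost:8080/v1"

def build_chat_request(prompt, base_url=LOCAL_BASE_URL, model="local-model"):
    """Build the URL and JSON body for an OpenAI-style chat completion call."""
    url = f"{base_url}/chat/completions"
    payload = {
        "model": model,  # placeholder; the local server uses its baked-in model
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(payload)

url, body = build_chat_request("Move all my rf research tabs to a new window")
# Actually sending this requires a llamafile server to be running, e.g.:
#   ./some-model.llamafile
# and then POSTing `body` to `url` with urllib or requests.
```

The point is that the wire format is identical either way; choosing a cloud provider over a local model is a business decision, not a protocol one.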
Uggggh, Mozilla, no one wants this.