This is so fucking stupid. All Gemini (and every other LLM) does is find info on the internet and regurgitate it back to you. The LLM part just does its best to narrow it down to the specific answer you’re looking for much more quickly and summarize it. It has no way to distinguish fact from fiction.
Same goes for the Google search engine. Does it suck? Yes. Is it worse than it used to be? Yes. Is that Google’s fault? Not necessarily; it’s more the rise of AI uploading mountains of bullshit to the internet. Every other search engine has the same problems.
Although I am in no way advocating for the use of Google search…
To be fair to Gemini, even though it is worse than Claude and GPT, the weird answers were caused by bad engineering, not by bad model training. They were forcing the incorporation of the Google search results even though the base model would most likely have gotten it right.
Is Google Gemini that AI product that told people to put glue on their spaghetti?
Excuse me that was actually on pizza
What? Are you too good for some authentic Italian paste-a?