• Flic@mstdn.social · 23 days ago

    @manicdave Even saying it’s “trying” to do something is a mischaracterisation. I do the same, but as a society we need new vocab for LLMs to stop people anthropomorphizing them so much. It is just a word frequency machine. It can’t read or write or think or feel or say or listen or understand or hallucinate or know truth from lies. It just calculates. For some reason people recognise it in the image processing ones but they can’t see that the word ones do the exact same thing.
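    A toy sketch of what "it just calculates" could mean, using a hypothetical ten-word corpus: next-word choice reduces to a frequency lookup. (Real LLMs use learned probabilities over tokens rather than raw counts, but the output is still just numbers.)

```python
# Toy illustration: pick the next word purely by how often it followed
# the previous word in a (made-up) training text. No reading, thinking,
# or understanding involved -- only counting.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(word):
    # Return the most frequent successor of `word` in the corpus.
    return follows[word].most_common(1)[0][0]

print(next_word("the"))  # "cat" -- it follows "the" 2 of 4 times
```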

    • octopus_ink@slrpnk.net · 23 days ago

      You are both right, but this armchair psychologist thinks it’s similar to how popular skeuomorphism was in the early days of PC GUIs and such compared to today.

      I think many folks really needed that metaphor in the early days, and I think most folks (including me) easily fall into the trap of treating LLMs like they are actually “thinking” for similar reasons. (And to be fair, I feel like that’s how they’ve been marketed at a non-technical level.)

      • Flic@mstdn.social · 23 days ago

        @octopus_ink yes I think we will eventually learn (there is clearly a lot of pushback against the idea that AI is a positive marketing term), and it’s also definitely the fault of marketing, which tries to condition us into thinking we desperately need a sentient computer to help us instead of learning good search terms. I am deeply uncomfortable with how people are using LLMs as a search engine or a future prediction machine.

    • Tobberone@lemm.ee · 23 days ago

      Exactly. Grok repeatedly generates a set of numbers, which, when keyed against its own list of words, spells out that Musk is spreading misinformation.

      It just happens to be frequently…
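      A minimal sketch of that point, using a hypothetical four-word vocabulary: the model only ever emits token IDs, and a separate lookup table supplies the words. (Real tokenizers like BPE map IDs to subwords, but the principle is the same.)

```python
# The model's actual output is just numbers (token IDs).
# A lookup table -- not the model "saying" anything -- turns them into words.
vocab = {0: "Musk", 1: "is", 2: "spreading", 3: "misinformation"}

generated_ids = [0, 1, 2, 3]  # what the model really produces

sentence = " ".join(vocab[i] for i in generated_ids)
print(sentence)  # "Musk is spreading misinformation"
```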