• HardlightCereal@lemmy.world · 1 year ago

    Language is a method for encoding human thought. Mastery of language is mastery of human thought. The problem is that predictive-text heuristics don’t have mastery of language, and so they cannot predict the desired output.

    • cloudy1999@sh.itjust.works · 1 year ago

      I thought this was an insightful comment. Language is a kind of ‘view’ (in the model-view-controller sense) of intelligence. It signifies a thought or meme. But language is imprecise and flawed. It’s a poor representation, since it can be misinterpreted or distorted. I wonder if language-based AIs are inherently flawed, too.

      Edit: grammar, ironically
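
      To make the MVC analogy above concrete, here is a minimal sketch (hypothetical class names, Python chosen purely for illustration, not anything from the thread): the Model holds the full internal state, and the View can only emit a flattened, lossy rendering of it, much as language renders thought.

      ```python
      # Toy model-view-controller sketch (illustrative names only).
      # The View is a lossy projection of the Model, the way language
      # is a lossy projection of thought.

      class Model:
          """Holds the full internal state (the 'thought')."""
          def __init__(self):
              self.items = []            # complete, structured data

          def add(self, item):
              self.items.append(item)


      class View:
          """Renders only a flattened projection of the Model."""
          def render(self, model):
              # Structure and types are lost in the rendering step.
              return ", ".join(str(i) for i in model.items)


      class Controller:
          """Mediates input and asks the View to express the Model."""
          def __init__(self, model, view):
              self.model, self.view = model, view

          def handle(self, user_input):
              self.model.add(user_input)
              return self.view.render(self.model)


      if __name__ == "__main__":
          app = Controller(Model(), View())
          print(app.handle("an idea"))   # "an idea"
          print(app.handle(42))          # "an idea, 42" (the integer is flattened to text)
      ```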

      • HardlightCereal@lemmy.world · 1 year ago

        Language-based AIs will always carry the biases of the language they speak. I am certain that a properly trained bilingual AI would be smarter than a monolingual AI of the same skill level.

    • Fedora@lemmy.haigner.me · 1 year ago

      Sorry, but you oversimplify a lot here; it hurts. Language can express and communicate human thought, sure, but human thought is more than language. Human thought includes emotions, experiences, abstract concepts, etc. that go beyond what can be expressed through language alone. LLMs are excellent at generating text, often more skilled than the average person, but they are limited by their training data and algorithms. They can miss nuances of context, tone, or intent. TL;DR: understanding language doesn’t imply understanding human thought.

      I’d love to know how you even came to your conclusion.

      • HardlightCereal@lemmy.world · 1 year ago

        Many languages lack words for certain concepts. For example, English lacks a word for the joy you feel at another’s pain; you have to go to German to name it: Schadenfreude. However, English is perfectly capable of describing what Schadenfreude is. I sometimes become nonverbal due to my autism, and in the moment there is no way I could possibly describe what I am feeling. But that is a limitation of my temporarily panicked mind, not a limitation of language itself. Sufficiently gifted writers and poets have described things once thought indescribable. I believe language can describe anything, given a book long enough and a writer skilled enough.

    • MajorHavoc@lemmy.world · 1 year ago

      “Mastery of language is mastery of human thought” is easy to prove false.

      The current batch of AIs is an excellent data point. These things are very good at language, and they still can’t even count.

      The average celebrity provides evidence that it is false. People who excel at science often suck at talking, and vice versa.

      We didn’t talk our way to the moon.

      Even when these LLMs master language, that’s not evidence that they’re doing any actual thinking, yet.