• samus12345@sh.itjust.works · 2 months ago

    I could see myself having conversations with an LLM, but I wouldn’t want it to pretend it’s anything other than a program assembling words together.

      • Not_mikey@lemmy.dbzer0.com · 2 months ago

        If LLMs are juiced-up autocomplete, then humans are juiced-up bacteria. Yeah, each pair shares the same end goal (guess the next word; survive and reproduce), but the methods they use to accomplish it are vastly more complex.

        • Swedneck@discuss.tchncs.de · 2 months ago

          But it’s still literally just looking at the text and calculating the most likely thing to follow; that’s fundamentally how LLMs work.
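
          A toy sketch of that “calculate what’s most likely to follow” step, with a made-up four-word vocabulary and made-up scores (a real LLM scores hundreds of thousands of tokens using a neural network, but the final step is the same idea):

          ```python
          import math

          # made-up vocabulary and scores for the next word after "a program assembling ___"
          vocab = ["words", "cats", "programs", "bacteria"]
          logits = [2.1, 0.3, 1.4, -0.5]

          # softmax turns the raw scores into a probability for each candidate token
          exps = [math.exp(x) for x in logits]
          probs = [e / sum(exps) for e in exps]

          # greedy decoding: pick the single most likely continuation
          next_token = vocab[probs.index(max(probs))]
          print({w: round(p, 2) for w, p in zip(vocab, probs)}, "->", next_token)
          ```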

    • Droggelbecher@lemmy.world · 2 months ago

      It’s not pretending to be anything; that’s just the function you described: assembling words together.