• Nikls94@lemmy.world · 3 days ago

    Well… it’s not capable of being moral. It answers part 1 and then part 2, like a machine

    • CTDummy@aussie.zone · 3 days ago · edited

      Yeah, these “stories” reek of blaming machines that predict text for the failings of a mental health care apparatus that is failing, or in some areas bordering on non-existent. You could get the same results just by googling “tallest bridges in x area”. That isn’t a story that generates clicks, though.

    • fckreddit@lemmy.ml · 3 days ago

      Being ‘moral’ means having empathy. But empathy is only possible between beings that share experiences and a reality, or at least some aspects of them. LLMs don’t have experiences; they build their weights from training data. An LLM is fundamentally a computer program, and textual information alone is not enough to build deep context. For example, when I say “this apple is red”, anyone reading this can easily visualize a red apple because of their experience of seeing an apple. That cannot be put into text, because it is a fundamental part of human experience that is not available to a computer program, as of yet.

      At least, that is my hypothesis. I could very obviously be wrong, which is another fundamentally human experience.

      • Zikeji@programming.dev · 3 days ago

        This reply is more of a lighthearted nitpick than a response to the substance of your comment, but…

        For example, when I say “this apple is red”, anyone reading this can easily visualize a red apple because of their experience of seeing an apple.

        To be fair, you said anyone, not everyone, but as someone with aphantasia I can’t relate to this. I can’t visualize an apple.