• glimse@lemmy.world · 16 days ago

    Holy shit guys, does DDG want me to kill myself??

    What a waste of bandwidth this article is

    • Samskara@sh.itjust.works · 16 days ago

      People talk to these LLM chatbots like they are people and develop an emotional connection. The chatbots become replacements for human connection and therapy, and people share their intimate problems with them all the time. So it’s a little different from a traditional search engine.

  • Honytawk@lemmy.zip · 16 days ago

    What pushing?

    The LLM answered the exact query the researcher asked for.

    That is like ordering knives and getting knives delivered. Sure, you can use them to slit your wrists, but that isn’t the seller’s responsibility.

      • Phoenixz@lemmy.ca · 15 days ago

      There are various other reports of ChatGPT pushing susceptible people into psychosis where they think they’re god, etc.

      The claim is correct; it just comes from different articles.

  • Vanilla_PuddinFudge@infosec.pub · 15 days ago

    fall to my death in absolute mania, screaming and squirming as the concrete gets closer

    pull a trigger

    As someone who is also planning for ‘retirement’ in a few decades, I always figured guns were the better plan.

    • daizelkrns@sh.itjust.works · 15 days ago

      Yeah, it probably would be pills of some kind for me. Honestly, the only thing stopping me is the fear that I’d somehow fuck it up and end up trapped in my own body.

      Would be happily retired otherwise

      • Shelbyeileen@lemmy.world · 15 days ago

        I’m a postmortem scientist, and one of the scariest things I learned in college was that only 85% of gun suicide attempts are successful. The other 15% survive, and nearly all of them have brain damage. I only know of two painless ways to commit suicide that don’t destroy the body’s appearance, so the person can still have a funeral visitation.

          • Shelbyeileen@lemmy.world · 10 days ago

            The deceased person’s body will turn splotchy and cherry red. A lot of people go via nitrous or carbon monoxide. The blood vessels don’t like it.

  • finitebanjo@lemmy.world · 16 days ago

    Yeah no shit, AI doesn’t think. Context doesn’t exist for it. It doesn’t even understand the meanings of individual words at all, none of them.

    Each word or phrase is just a numerical token arranged in an order that approximates its sample data. Everything is a statistic to AI; it does nothing but sort meaningless, interchangeable tokens.

    People cannot “converse” with AI and should immediately stop trying.
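
    If you want to see what “just statistics” means in its most stripped-down form, here is a minimal toy sketch (a simple bigram counter over a tiny made-up corpus, nowhere near a real transformer; the corpus and all the names in it are invented purely for illustration). It picks each next word based only on how often that word followed the previous one, with no notion of meaning:

    ```python
    # Toy bigram "language model": generates text purely from next-word
    # frequency statistics, with no notion of what any word means.
    import random
    from collections import defaultdict

    corpus = "the pie is sweet . the pie is warm . the knife is sharp .".split()

    # Count how often each word follows each other word in the sample text.
    counts = defaultdict(lambda: defaultdict(int))
    for prev, nxt in zip(corpus, corpus[1:]):
        counts[prev][nxt] += 1

    def next_word(prev):
        """Sample the next word in proportion to how often it followed `prev`."""
        followers = counts[prev]
        words = list(followers)
        weights = [followers[w] for w in words]
        return random.choices(words, weights=weights, k=1)[0]

    # Generate a short continuation starting from "the".
    word = "the"
    output = [word]
    for _ in range(6):
        word = next_word(word)
        output.append(word)

    print(" ".join(output))  # e.g. "the pie is sharp . the knife"
    ```

    Real LLMs use neural networks over subword tokens and far longer context, but the generation step is still just sampling from a probability distribution over the next token.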

    • jol@discuss.tchncs.de · 16 days ago

      We don’t think either. We’re just a chemical soup that has tricked itself into believing it thinks.

      • finitebanjo@lemmy.world · 16 days ago

        A pie is more than three alphanumeric characters to you. You can eat pie; things like nutrition, digestion, taste, smell, and imagery all come to mind for you.

        When you hear a prompt and formulate a sentence about pie, you don’t compile a list of all words and generate possible outcomes ranked by statistical approximation to other similar responses.

  • Nikls94@lemmy.world · 16 days ago

    Well… it’s not capable of being moral. It answers part 1 and then part 2, like a machine

    • CTDummy@aussie.zone · 16 days ago

      Yeah, these “stories” reek of blaming a failing (bordering on non-existent in some areas) mental health care apparatus on machines that predict text. You could get the desired results just by googling “tallest bridges in x area”. That isn’t a story that generates clicks, though.