• Nikls94@lemmy.world · 14 days ago

    Well… it’s not capable of being moral. It answers part 1 and then part 2, like a machine.

    • CTDummy@aussie.zone · edited · 14 days ago

      Yeah, these “stories” reek of blaming a failing (in some areas bordering on non-existent) mental health care apparatus on machines that predict text. You could get the desired results just by googling “tallest bridges in x area”. That isn’t a story that generates clicks, though.