• SpiderShoeCult · 9 months ago

    I would point you to Hanlon’s razor for the first part there.

    It’s not about dehumanizing; it’s merely comparing the outputs. It doesn’t really matter whether they act for reasons or have thoughts if the output is the same. Should we be more forgiving if an LLM outputs crap because it’s just a tool, or more forgiving if a human outputs the exact same crap because it’s a person?

    And, just for fun, to bring solipsism into this: how do we actually know that they have thoughts?