• ebu@awful.systems · 4 months ago

    > The point is that even if the chances of [extinction by AGI] are extremely slim

    the chances are zero. i don’t buy into the idea that the “probability” of some made-up cataclysmic event is worth treating like any other number just because technically you can’t guarantee that a unicorn won’t fart AGI into existence, which in turn starts converting our bodies into office equipment

    > It’s kind of like with the trinity nuclear test. Scientists were almost 100% confident that it won’t cause a chain reaction that sets the entire atmosphere on fire

    if you had done just a little bit of googling instead of repeating something you heard off of Oppenheimer, you would know this was basically never put forward as a serious possibility (archive link)

    which is actually a fitting parallel for “AGI”, now that i think about it

    > EDIT: Alright, well this community was a mistake…

    if you’re going to walk in here and diarrhea AGI Great Filter sci-fi nonsense onto the floor, don’t be surprised if no one decides to take you seriously

    …okay it’s bad form but i had to peek at your bio

    > Sharing my honest beliefs, welcoming constructive debates, and embracing the potential for evolving viewpoints. Independent thinker navigating through conversations without allegiance to any particular side.

    seriously do all y’all like. come out of a factory or something