• pinkdrunkenelephants · 9 months ago

    They were. Most of the history we were taught was nothing more than pro-America propaganda.

    For example, the true horrors of slavery aren’t actually commonly known, nor is the true extent of the effects of post-Civil War racist policies like redlining. Or that “crimes” like loitering and trespassing are actually holdovers from fucking Jim Crow laws. Or that mixed-race Americans originated as the rape babies of enslaved women.

    Or even colonization. Did you know the stupid fucking goddamn Belgian government was the root cause of the Rwandan genocide? Decades beforehand, they deliberately pitted the Hutus and the Tutsis against each other by giving the Tutsis special privileges and land and shit, exploiting the Belgians’ own flimsy understanding of the social order the Hutus and Tutsis already had and enraging the Hutus. And the Belgian government never owned up or took responsibility for it. It wasn’t just France, either. Macron legit apologized for the French government’s role, but Belgium never did.

    Who here was taught about how the U.S. overthrew legit governments in South America and replaced them with dictators?

    Or that Libya was bombed to hell and back not because its dictator was a dictator, but because he wanted to start selling oil for gold instead of U.S. dollars?

    Who is ever taught the true nature of any of this shit?