16 tins of Spam
In recent years, we in the western world have been forced to acknowledge our sometimes shameful history that has included the oppression of racial minorities, women, homosexuals, certain religions, and so on. Now, in the age of political correctness it seems that guilt is being brought to bear by a seemingly endless parade of special interest groups.
Perhaps one way of interpreting all this is that at some point pretty much every section of society is going to be considered untouchable in terms of criticism, being the object of humour, or practically anything that might be construed as offensive. Perhaps a further extension of this is that one section of society might end up having to carry that burden of guilt and reparation for all of us. And to be honest guys... I think it's gonna be us.
Lately I've noticed a certain amount of weight tipping the scales away from the male hegemony that's ruled the world for the last few millennia, beyond the point of balance, and into a free-for-all of condemnation, petty name-calling and downright slander. I'm not saying that this has reached significant proportions, but I can feel attitudes slipping that way.
Guys, any thoughts? Has anyone felt like they're being blamed for what's wrong with everyone else's lives? Anyone felt marginalised? Been made to feel guilty for being male and having money and a job? Ladies? Your perspective is always appreciated.