I don’t know where this Neanderthal culture comes from. I thought society was waaaaaaaay past this.
I can’t help thinking that social media gave them a platform to bond over their anti-female stance even while desiring the very women they disdain.
Where did this toxic masculinity come from? Is it college jock culture, where they learn to despise women for being their own selves rather than beholden to some outdated role written for them?
I despair. Haters gonna hate. I think we as a society simply have to move on and ignore them as much as we can.
I’m a straight British male with two kids living on the East Coast, and this feels like a very American cultural shortfall.