[–] madmardigan 0 points 1 points (+1|-0) ago 

I'm actually curious what everyone thinks came "first".

Did men abandon their role first or did women?

Solutions?

[–] Charlez6 0 points 3 points (+3|-0) ago 

The welfare state kicked it off. You could argue that's the fault of weak-willed men, if you want to. With that, women had a guaranteed source of income, defence, etc. Any time they needed anything, they could go crying to the government and the government would deliver it for them.

Women could then afford to screw the losers (the 'bad boys'), knowing that the government would make the responsible, hardworking, taxpaying men pay their bills. The government cucked decent men.

So why bother working hard, being reputable, making an honest living, being ambitious, etc.? It's a waste of time when you can either swipe right for pussy or accept that you're never getting any anyway. And so men were ruined too.

[–] cuello_rojo 1 points 3 points (+4|-1) ago  (edited ago)

Women went wrong long before men bailed. Eventually all the redeeming virtues, except their sexuality, had been removed from (American) women.

Listen to sappy love songs from the 60s and 70s and you'll see how guys used to think of women. Those days are long gone.

[–] nrlftw 0 points 0 points (+0|-0) ago 

  1. Feminist movements. Men allowed women to walk all over them. Men stopped making the decisions and allowed women to come into authority positions with little opposition. Women only respect strong men, and so they continued to walk all over the weak ones.