Some of these axioms and suggestions just seem like common sense to me, not some huge other-women-are-entitled-bitches-etc. movement (I'm really not trying to boil it down to that attitude, but that's all 9am is giving me). Building up women by encouraging self-esteem, fostering creativity and honest two-way communication, providing equal opportunities, etc., seems like a better use of time for promoting intergender relations than all of this loaded language.
It feels like someone declaring that not all girls like giving blow jobs. At first it's like, who doesn't like giving blow jobs??! But if you actually think about it, that totally makes fucking sense. You try shoving and sucking a hard, ~6-inch organ in your mouth without gagging.
To flip it, a [stupid] girl can be turned off if a guy needs to stop for a moment because of arm fatigue. But if you actually think about it, of fucking course. You try thrusting your pelvis, postured like a seal powered by primordial dude goo, at 1.5 thrusts/second for 10 minutes, ratcheting it up for the last 30 seconds, without getting tired.
Communicating upfront that you don't like blow jobs saves so much more heartache than trying to look like a down, cool girl who'll suck some bird when really you just hate it. Then either you and your partner work out another oral deal, or you move on.
Unfortunately, our genders make us different. That is our biology. I feel we should encourage doing what FEELS natural to YOU. Inviting people to be themselves makes for a much more positive environment. Anything we do as humans is inherently natural, so we should celebrate it. If it's a woman taking the lead during sex more often or always doing the dishes, or a man being the sole breadwinner or the laundry wizard, fuck yeah to it all, because if you're being true to you, then that's it.
Do you look at flowers and refuse to pick them when, if you actually look, each is just as distinct and unique as labia majora?
There are things I agree with in "red pill" speak, like evolutionary psychology effects and filling the feminine role, but it all feels very bombastic. Equal roles =/= putting the other down, despite power differences. It's kind of like being blind: without vision, the other senses have to get stronger. That's a natural reaction. Filling your role should be natural.
Feminism should not have a negative connotation. If feminism, ideally, promotes equality for everyone, how could that be bad? I see the bastardized version we're served today, but if equality is its original message, why put work into spreading an anti- message instead of building the original back up? I feel like identifying with some of this stuff makes me seem like I don't like women, and that's probably the worst message you can send to anyone of any gender. I don't like feeling like that, and it's probably all in my head.
We're all PEOPLE. Of course our biology dictates lots of things. I like to think all people want the best for themselves, for those they care about, and hopefully for the world.
I'm really not sure what I'm getting at. These are some thoughts I've had for a while, but I haven't really figured out how to articulate them well. Still, I hope I've added some dimension from someone who identifies here, though not fully.