Why do I see women everywhere around me getting married, having a baby, and continuing to work as if nothing at all has changed in their lives? I also see women having babies with some guy they don't even stay with for more than six months after the child is born, and then going off to college to pursue a degree. A baby changes nothing in their lives. And if any of these women do become "stay-at-home mothers," they still spend most of their time trying to find ways to make money! And we really see nothing at all wrong with this?

I was visiting a friend yesterday, and they have a new baby. The baby is barely two months old, yet the mother is working full-time and always gone. And we see nothing wrong with this at all? They actually have several kids, the oldest no more than five years old, and yet the mother is always gone, working and using her college degree. It's been that way since day one. Her husband apparently sees nothing wrong with it either. Tell me why the hell we have young women having babies, then going off to college when the kids are just a few months old, looking for family members to take care of them while they're gone? And most don't even marry their boyfriends, yet nobody sees anything at all wrong with this??

What has happened to us as a society? Let's get it straight: if you have kids, they should be your top priority. Mothers should not be encouraged to go off to work or pursue an education while they dump their kids with whoever will take them and neglect forming a real and true relationship (as in marriage) with their child's father. Money is masculine. The making of money is masculine. Making money requires competition and putting oneself out there to face the world and achieve. Fatherhood should strengthen a man's drive to make money, and motherhood should weaken a woman's desire to do so. OOPS, I forgot I'm not allowed to say that!
I'm just supposed to sit here and say "you go girl!" to any woman who makes it out there in a "man's world." Mothers are encouraged to pursue college degrees and paid employment, and nobody is supposed to say anything about it at all. I've heard several men say they would love for their wives to stay home, but that they would never ask them to. I mean, come on!! Do we really see nothing wrong with this? Isn't it time somebody said something?? Women need to make the home their top priority again, and men need to be real men again, leaders who take charge, instead of pansies who just go along with whatever their wives say. When all is said and done, that college degree is going to be nothing more than a burden: financial debt, wasted youth, and an inability to have the life you really want.