At its core, feminism is the belief in full social, economic, and political equality for women. Feminism largely arose in response to Western traditions that restricted the rights of women.

In prehistoric times, roles and occupations were divided according to what each person could do. Men, being physically stronger, went out to hunt and gather food, while women, seen as more organized and emotionally resilient, took care of the family and the household. There was no discrimination; roles were simply assigned according to one's capabilities.

In historic times, when people began to settle and societies were formed, the division of men working outside the home and women staying inside it became mandatory. Women were not allowed to study and were confined to the house itself. Their role was reduced to looking after the family, and they were often treated as subordinate to men. Society was dominated by men, and almost every decision was made by them.

Even in modern times, in certain parts of the world, males are given preference over females, and women are not even allowed to compete and show their potential.

It is therefore necessary for people to understand the actual meaning of feminism: people of every gender, not just one, advocating and fighting for equal rights for women. Only when equal opportunities are provided to everyone can the worthiest and strongest contenders come to the front.

For the world to be at peace, society must be egalitarian. No one should ever feel that they missed a chance not because they were incapable or incompetent, but because they were not allowed. The world has improved since historic times, but much more still needs to be done.

This will not happen by itself, or through women fighting for it alone. Men must be an equal part of the movement, because everyone deserves equal social, economic, and political recognition.
