For centuries, men have been the ones with the power in societies across the world. Last century, women started to acquire power in the west. But to this day, this is still a Man's world. True or False? Why do so many men feel that this is not true? What do you think about this? Are men oppressed?
I took the stance that men aren't oppressed, though they are certainly harmed in important ways. There are some areas where women are at an advantage or even dominate, and some laws benefit women more than men. But, in general, men still hold more power than women, even in western societies.