The West too has a ‘rape culture’
Rape culture is a concept in feminist research that describes the prevalent attitudes, norms and practices in a society that trivialise, excuse, tolerate, or even condone rape. The Encyclopedia of Rape defines it as “a complex set of beliefs that encourages male sexual aggression and supports violence against women”.