However, the role of women in society has changed significantly in recent decades. The first positive changes in women's standing became especially apparent after World War I, as people gradually began to recognize the importance of women in society. It is safe to say that this did not happen easily.
Women have always had a great role to play in any society. During colonial times, a natural affinity was clearly evident between black women who were determined to resist the bonds of slavery that prevented them from succeeding and the many white women who resented male chauvinism.