Traditionally, women have been associated with the “home” and men with the “world”. In many ways, the coming of the American Civil War of 1861-65 challenged the ideology of Victorian domesticity that had defined the lives of men and women in the antebellum era. Women’s role in society changed drastically as they gained social and economic opportunities and took on positions of power and responsibility. The Civil War brought about changes in women’s lives both during its course and