10 Facts About the Wild West
The Wild West is a term used to describe the period of American history between the end of the Civil War in 1865 and the turn of the 20th century. Large numbers of people moved to the western territories and states during this period in pursuit of new opportunities, as the American frontier was rapidly being settled.