The western is a genre of film, television, literature, and art that emerged during the nineteenth century. The genre mainly focuses on the latter half of that century, a period in American history often referred to as the Wild West. Stories and depictions are usually set during or after the American Civil War, although the genre later took on more modern settings. The central figures of westerns are usually cowboys, or gunslingers, whose enemies range from Native Americans to greedy...
Salem Press Encyclopedia, 2019. 4p.