Western
Meanings
- A film, or some other dramatic work, set in the historic (c. 1850-1910) American West (west of the Mississippi River), focusing on conflict between whites and Indians, lawmen and outlaws, ranchers and farmers, or industry (railroads, mining) and agriculture.
- Of, facing, situated in, or related to the west.
Wikipedia Articles
- Western: Western may refer to: Western, Nebraska, a village in the US; Western, New York, a town
- Western world: The Western world, also known as the West, primarily refers to various nations and states in the regions of Western Europe, Northern America, and Australasia;
- Western Europe: Western Europe is the western region of Europe. The region's extent varies depending on context. The concept of "the West" appeared in Europe in juxtaposition