West Definition
The Western world; the regions, primarily situated in the Western Hemisphere, whose culture is derived from Europe.
The western states of the United States.
To move to the west; (of the sun) to set. [from 15th c.]
Idioms, Phrasal Verbs Related to West
- the West
Origin of West
- From Old English west, from Proto-Germanic *westą. Compare West Frisian and Dutch west, German West, Danish vest.
From Wiktionary
- Middle English, from Old English; see wes-pero- in Indo-European roots.
From American Heritage Dictionary of the English Language, 5th Edition