
Dictionary definition for: West

1. (n) the countries of (originally) Europe and (now including) North and South America

2. (a) situated in or facing or moving toward the west

3. (r) to, toward, or in the west; "we moved west to Arizona"

4. (n) the cardinal compass point that is at 270 degrees

5. (n) the region of the United States lying to the west of the Mississippi River

6. (n) British writer (born in Ireland) (1892-1983)

7. (n) United States film actress (1892-1980)

8. (n) English painter (born in America) who became the second president of the Royal Academy (1738-1820)

WordNet 2.1 Copyright Princeton University. All rights reserved.