Meaning of west
Definition of west
(adj)
situated in or facing or moving toward the west
(adv)
to, toward, or in the west; "we moved west to Arizona"; "situated west of Boston"
(noun)
the countries of (originally) Europe and (now including) North America and South America
the cardinal compass point that is at 270 degrees
the region of the United States lying to the west of the Mississippi River
the direction corresponding to the westward cardinal compass point
British writer (born in Ireland) (1892-1983)
United States film actress (1892-1980)
English painter (born in America) who became the second president of the Royal Academy (1738-1820)
a location in the western part of a country, region, or city