Meaning of calif.
Definition of calif.
(noun) a state in the western United States on the Pacific; the 3rd largest state; known for earthquakes