Entry from US English dictionary
Definition of West Coast in English:
The western seaboard of the US from Washington to California.
- From Baja California up the US West Coast the climate is mild and arid, other than in winter.
- I coached a young assistant coach at a major basketball power on the West Coast a few years ago.
- That dam would make the West Coast self-sufficient.