Entry from US English dictionary
Definition of left coast in English:
The West Coast of the US, especially California: "America's left coast should be on everyone's vacation list."