Definition of Wild West in English:

Wild West

  • The western regions of the US in the 19th century, when they were lawless frontier districts. The Wild West was the last of a succession of frontiers formed as settlers moved gradually further west.
