Wild West


noun

the western frontier region of the U.S., before the establishment of stable government.

Origin of Wild West

An Americanism dating back to 1850–55

British Dictionary definitions for Wild West

Wild West

noun

the western US during its settlement, esp with reference to its frontier lawlessness