Westcentrism
noun · rare
Definitions
Noun
- 1 (uncountable) The practice of viewing the world from a Western perspective, with an implied belief, whether conscious or subconscious, in the preeminence of Western culture.
Etymology
From West + -centrism.