What is the West?

Broadly speaking, what we call the West comprises the countries and peoples formed by the meeting of Greek philosophy, Roman law, and Hebrew religion. There is a great deal of diversity within the West, but its religion, ideas, art, literature, and geography set it apart from other civilizations.

— Rod Dreher, "Yes, They Really Do Despise Their Civilization," The American Conservative
