What is the West?

Broadly speaking, what we call the West are the countries and peoples formed by the meeting of Greek philosophy, Roman law, and Hebrew religion. There’s a great deal of diversity within the West, but religion, ideas, art, literature, and geography set it apart from other civilizations.

Rod Dreher, "Yes, They Really Do Despise Their Civilization," The American Conservative

Author: David Wolf

An adviser to corporations and organizations on strategy, communications, and public affairs, David Wolf has lived and worked in Beijing since 1995 and now divides his time between China and California. He is also a policy and industry analyst focused on the innovation and creative industries, a futurist, and an amateur historian.
