Western world



For alternative meanings for "The West" in the United States, see The West (U.S.) and U.S. Western states.

The term Western world can have multiple meanings depending on its context. Originally it referred to Europe; most modern uses of the term refer to the societies of Europe and their genealogical, colonial, and philosophical descendants, typically also including those countries whose ethnic identity and dominant culture derive from European culture.

Western countries

To explain what is typical of Western society and Western culture, we must first define what constitutes the West (also called the Occident). Which countries belong, and which don't? Historically, the definitions have varied.

Historical

The Hellenic division between Greeks and "barbarians" (itself a Greek word) predates the division between East and West. The contrast was between the Greek-speaking culture of mainland Greece, the Aegean, the Ionian coast and Magna Graecia in southern Italy, and the surrounding non-Greek cultures of Thrace and Anatolia, the Persian Empire, the Phoenicians and Egypt. This contrast can be traced back to the Trojan War, traditionally dated to 1194 BC - 1184 BC. Presuming it had a historical basis, that conflict was fought between the Achaeans and the non-Greek Trojans of western Anatolia. The Greeks also considered the Persian Wars of the early 5th century BC a conflict of west versus east.

When the Roman Emperor Diocletian divided the administration of the Roman Empire into eastern and western halves in the late 3rd century A.D. (see Tetrarchy), the eastern part evolved into the Byzantine Empire, a Christian theocracy in which the emperor was also head of spiritual life ("caesaropapism"). At the same time, Roman polity in the western part crumbled under pressure from outside the empire and was slowly rebuilt as a culture divided between two sources of power, the pope and the Emperor.

The Eastern part of the Mediterranean world, though Christian, was contrasted with the West throughout the Middle Ages. "Latins" and "Franks" sacked Constantinople in 1204 during the Fourth Crusade as ruthlessly as any alien culture. Only with the rise of the Ottoman Turks as a non-Christian contrast did the Byzantine "East" become, to some extent (largely in retrospect, and largely among non-historians), part of the "West". (Compare the concept of "Christendom".) As a result of its Byzantine heritage, Orthodox Europe, including Russia, may or may not be considered part of the West.

Expanded

From the late 15th century onward, explorers and conquerors such as Christopher Columbus and Hernán Cortés claimed new lands on behalf of the Western nations. Europeans continued to settle these lands up until the 19th century, and the term "Western" thus came to encompass former colonies such as the United States, Canada, Australia and New Zealand, populated mostly by people of European descent.

Japan in 1955 (shortly after the end of its occupation by the US) would be considered by most to be part of the West, while Japan in 1750 would not. Similarly, North America in 1850 would be considered part of the West, while it would not be in 1450 or even 1500, before substantial colonization had occurred.

Cold War

During the Cold War, a new definition emerged. The Earth was divided into three "worlds", numbered first, second and third. The first world comprised NATO members and other nations aligned with the United States. The second world comprised the Eastern bloc nations in the Communist sphere of influence, such as the Soviet Union and the People's Republic of China. The third world consisted of nations aligned with neither. Hence, the Western world became a synonym for the first world.

Post-Cold War

After the end of the Cold War, the phrase "second world" fell into disuse, and "first world" came to refer to the democratic, capitalist, wealthy, industrialized, developed nations, characteristics shared by most of the nations aligned with the US. The "third world" came to refer to the poor, unindustrialized developing nations. The term "Western" is thus not so much a geographical definition as a cultural and economic one; therefore:

  • African history can speak of Western influences by a group of small countries that lie to its north.
  • Australia can be considered a Westernized country located in the East.
  • International companies founded in America may be considered foreign influences in Europe, but be said to be Western when their presence is seen (and sometimes criticized) in the Orient.

Nowadays, people differ in their definitions of the West, and different definitions overlap only partly. There are certainly non-Western developed nations, not all Western countries are members of NATO, etc.

Use of the term "the West" has often been motivated by racist attitudes towards Slavic Europeans, in that the term does not encompass them, whereas "Europe" does.

In the Near East or Middle East (both terms relative to Europe, which lies to their west), the distinction between Eastern and Western Europe is of less importance, so countries that we might speak of as part of Eastern Europe, e.g. Russia, are counted as Western when speaking about the general culture of Europe and Christianity. But the line between East and West does not move any further east, even when the West is contrasted with China.

Current

Depending on context, the Western countries may be restricted to the founding members of NATO, the European Union (EU) and Switzerland. A broader definition might extend to Australia and New Zealand and sometimes Israel.

The Asian countries Japan, South Korea and the Republic of China (Taiwan) are sometimes considered part of the West and sometimes not.

Latin American countries are sometimes considered part of the West and sometimes not. Mainland China, the remainder of the Middle East, India, and Russia are generally not considered part of the West.

One should distinguish "Western society" from the socio-economic term "first world": South America, for example, is sometimes described as a Western society even though much of it is poor.

The term the North has in many contexts replaced earlier usage of the term "the West", particularly in the critical sense. It is somewhat more coherent, because there is some absolute geographical definition of "northern countries", and this distinction statistically happens to capture most wealthy countries (and many wealthy regions within countries).

More typically, the term "the West" carries a pejorative meaning, serving simply to describe and delineate the wealthy and dominant societies from the poorer ones: those who believe they are subjugated economically, militarily, and otherwise by deliberate restraints placed on them by the wealthier societies. "The West" then becomes simply a term meaning "wealthy, colonial, Europe-descended (or allied) societies."

Western life

Western countries have in common a relatively high standard of living for most citizens compared with the rest of the world. They may also have democratic, (mostly) secular governments, the rule of law, and developed bodies of law that give some expression of rights for their own citizens. High levels of education and a similar, "modern" popular culture may also mark a Western or Westernized society. Militarily and diplomatically, these "Western" societies have generally been allied with one another to one degree or another since World War II. In fact, some would argue that this is the definition of the West and explains why Japan is usually considered Western while Ecuador is not.

Western thought

The term Western is usually associated with the cultural tradition that traces its origins to Greek thought and the Christian religion. (See Western culture.)

Western thought may be traced along a chain beginning with Athenian thinkers such as Solon and Socrates. It continued through the Roman Empire and, with the addition of Christianity (which had its origins in the East), spread throughout Europe. During the colonial era, it became implanted in the Americas and in Australasia.

In the early 4th century, the Emperor Constantine the Great established the city of Constantinople as the capital of the Eastern Roman Empire. The Eastern Empire included lands east of the Adriatic Sea, bordering the Eastern Mediterranean and parts of the Black Sea. This division into Eastern and Western Empires was reflected in the administration of the Christian Church, with Rome and Constantinople disputing which city was the capital of Christianity (see Great Schism). As the eastern and western churches spread their influence, the line between "East" and "West" shifted, but it generally followed a cultural divide defined by the existence of the Byzantine Empire and the fluctuating power and influence of the church in Rome. This cultural division was and is long-lasting; it still existed during the Cold War as the approximate western boundary of the countries allied with the Soviet Union.

There are ideals that some associate with the West, and there are many who consider Western values to be universally superior. The author Francis Fukuyama argues that Western values are destined to prevail throughout the entire world.

However, there are many who question the meaning of the notion of Western values and point out that societies such as Japan and the United States are very different. Furthermore, they point out that advocates of Western values are selective in what they include as Western: usually, for example, the concepts of freedom, democracy, and free trade, but not Communism and Nazism, both of which began in the West, or slavery, which reached massive levels in the West and whose history there goes back millennia. By selecting which values count as Western, one can therefore tautologically show that Western values are superior, since any inferior values are by definition not Western. See also: No true Scotsman fallacy.

A different attack on the concept of Western values comes from those who advocate Islamic values or Asian values. In this view, there is a coherent set of traits that defines the West, but those traits are inferior, usually associated with moral decline, greed, and decadence. Such critics are concerned about the Westernization of the rest of the world.

Since the countries of the "West" were generally those that explored and colonized outside of Europe, the term Western became, for some people, associated with European colonialism. However, many other societies have established colonial rule, so colonialism is not a uniquely Western phenomenon.

Historically, one of the interesting questions is how the societies associated with "the West" came to dominate the world between 1750 and 1950.
