Tradition is the method by which we perpetuate culture, passing on beliefs, ideas, and customs from one generation to the next. Traditions form the basis of cultural identity, the mortar with which a family, a city, or a country builds its personality. They often stand as unassailable bastions at the core of a people's history.

And that's why they're dangerous.

Traditions are often immune to challenge merely by virtue of being "traditions": we do things "because that's the way we've always done it", whether it's who carves the turkey, how we set up the Christmas decorations, or how we venerate figures from the past. And while culture is often the stated excuse, "the past" is the real key here, because tradition looks only to the past, and only with longing in its eyes.

The hidden assumption in tradition is this: the way things were done before is the best way and will always be the best way going forward. That assumption is quite often wrong. If a behavior is, in fact, the best method, then it should be able to stand on its own without being labeled "traditional"; if it is not, then no amount of reverence for history should prevent the best choice from being made. "Tradition" implies that a bunch of desert nomads from 2000 years ago could know what's best for people living in hundred-story high rises. It implies that a handful of rich white men from a few hundred years ago knew what was best for computer gamers and gun owners today. It argues that, because something was done in a specific way at some point in the past, that way is now a priori the best way and should never change.

As Toscanini said, "Tradition is just the last bad performance."

There are reasons to value cultural, historical or social traditions in the context of history, such as teaching traditional art forms, languages, or even rites. But those traditions should never be divorced from the time period in which they arose or their own cultural histories: why things were done a certain way, what the justification was, the effects of that tradition on future history, etc. Traditions should be valued as artifacts in the same way we value physical artifacts: as curiosities and points of reference, not as relevant for today's world. We can appreciate the role of ceremonial dance and religion in the political and social framework of pre-modern cultures without feeling the need to revere them, the same way we can appreciate an old flint knife without feeling the need to give up stainless steel or carbonite.

History shouldn't be ignored, but neither should it be placed on a pedestal and worshipped. Good practice and good information can stand on their own without arbitrary enforcement: if something is useful, it is useful whether it is old or new. We study Aristotle, Euclid, Newton, and Freud not out of tradition but because of the inherent value of their ideas, even if we've since proven them "wrong" (or at least limited): they are stepping stones along the path to modern logic, mathematics, physics, and psychology. Just as we can appreciate Bach and Mozart without limiting ourselves to Baroque music and abandoning the Romantic period, we can appreciate the trappings of history without feeling the need to continue them, and can instead improve or even abandon them as need arises.

As we approach a holiday season that is often draped in traditions, don't be afraid to abandon them. If you find yourself sad or depressed, it may be because you're trying to cling to an outmoded idea of "should" that no longer fits - a tradition that is no longer useful. Embrace whatever celebrations or routines you find useful, even if they aren't "traditional": go out instead of eating in, invite close friends to dinner instead of family, skip the presents and donate to charity or spend time at a soup kitchen - whatever you feel is a better fit for you and your life today, not what you've been taught "should" be done.

Tradition looks to the past, but, while we must remember what has come before so as not to repeat the same mistakes, the past is not a model by which we can live. Progress exists only in the future, and with it comes the most powerful force we know - the thing that can forever destroy the historical, repetitive, traditional grievances of life:



Seeing isn't believing

"Observation" isn't a direct action.

We see things because light bounces off of them and into our eyes (biological processes aside). We often think of this as "seeing the object", but the key point is that light itself is a thing, and it's really the light you're seeing: you have no direct interaction with the object. You only interact with it through the intermediation of light.

All observation takes place like this: whether it's photons (light), electrons, or anything else, the only way to "observe" something remotely is to bounce things off of it. Most of the time, this doesn't matter much: photons bouncing off of "normal"-sized objects don't do much to the object. At atomic or subatomic scales, however, the thing we're trying to observe and the particle we're using to "see" it are comparable in size. As such, "bouncing" anything off of them knocks them around, like pool balls on a table.

The result, as Werner Heisenberg determined, is that we can take only one precise measurement - one observation - of a particle at a time, because as soon as you measure one property (usually position or momentum), you disturb the others. This has been formally labelled "Heisenberg's Uncertainty Principle", and it forever broke down the myth of the "impartial observer" - someone (usually a scientist) who could sit back and "just watch" without interfering with a process.
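To get a sense of scale, here is a quick back-of-the-envelope sketch of my own (not part of the original argument) using the standard relation dx * dp >= hbar / 2: confine an electron to roughly one atomic diameter and see how large the resulting minimum spread in its velocity must be. The scenario and names are illustrative assumptions; only the physical constants are standard.

```python
import math

HBAR = 1.054571817e-34            # reduced Planck constant, J*s
ELECTRON_MASS = 9.1093837015e-31  # electron rest mass, kg

def min_momentum_spread(position_spread):
    """Smallest momentum uncertainty allowed by dx * dp >= hbar / 2."""
    return HBAR / (2 * position_spread)

dx = 1e-10                        # confine the electron to ~1 atomic diameter (m)
dp = min_momentum_spread(dx)      # minimum momentum uncertainty, kg*m/s
dv = dp / ELECTRON_MASS           # corresponding velocity uncertainty, m/s

print(f"dx = {dx:.1e} m -> dp >= {dp:.2e} kg*m/s -> dv >= {dv:.2e} m/s")
```

The velocity uncertainty comes out on the order of hundreds of kilometers per second - far from negligible, which is why the "impartial observer" fails at atomic scales even though it works fine for billiard balls.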

Heisenberg's was the first blow to the notion of the independent observer, but it wasn't the last. Quantum theory, as it has developed, has focused more and more on particles themselves existing as what we describe today as a kind of probability wave. In short, if you shoot an electron out of a tube pointed in a given direction, you can have a general idea of where the electron may be at any given moment, but you can't be sure. That isn't surprising in itself for most people; the hard part is that, for the observed results to be correct, you can't say the electron is actually anywhere in particular. The uncertainty in its location isn't just a matter of our not knowing - it seems to be intrinsic to the particle itself.

Here's the classic experiment. We have an electron gun pointed at a wall with two slits in it; logically, the electron can only travel through one slit or the other. However, if we place a piece of electron-sensitive paper on the far side of the slits to see where the electron ended up, we get something odd: the resulting pattern can only occur if the electron takes both possible paths, not just one or the other. This holds even if the electrons are fired one at a time, with enough time for the first to hit before the second is released.

Let me restate that, because it's possible to miss the implication: in order to account for what we actually see happen, we have to assume that part of the electron - something indivisible under normal circumstances - passes through each slit. The notion that we can't know its position isn't just a factor of our not having the precision to determine it: according to results, the particle has no definite position until it reaches the paper and, it seems, has to take all potential paths to get there. Our best understanding of this is that it exists only as a probability, and that somehow those potential probabilities are what interact and cause the observed pattern.

It gets weirder: if we set up a system to watch the electron as it passes through the slits, so that we can observe which one it goes through, the odd pattern requiring the probability explanation disappears; the result is explainable with classical physics models. So it appears that the electron behaves like a single particle once we observe it, but until we do, it behaves like a set of probabilities. Furthermore, this is true not just of electrons but of every particle the process has been tested with: as near as we can tell, all matter exists probabilistically until observed and classically after observation.
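The difference between the two behaviors can be sketched numerically. The toy model below is my own illustration, not a real physics simulation: the wavelength, slit separation, and screen distance are arbitrary units, and the function names are mine. The key is *where* the squaring happens - the "both paths" model adds the complex amplitudes from the two slits and squares the sum, while the "one path or the other" model squares each amplitude first and adds the results.

```python
import cmath
import math

# Arbitrary illustrative units, not real experimental values.
WAVELENGTH = 1.0
SLIT_SEPARATION = 5.0
SCREEN_DISTANCE = 100.0

def amplitude(path_length):
    """Complex amplitude for one path; phase set by the path length."""
    return cmath.exp(2j * math.pi * path_length / WAVELENGTH)

def path_lengths(x):
    """Distances from each slit to the point x on the screen."""
    d1 = math.hypot(SCREEN_DISTANCE, x - SLIT_SEPARATION / 2)
    d2 = math.hypot(SCREEN_DISTANCE, x + SLIT_SEPARATION / 2)
    return d1, d2

def intensity_quantum(x):
    """Both paths contribute: amplitudes add, THEN we square."""
    d1, d2 = path_lengths(x)
    return abs(amplitude(d1) + amplitude(d2)) ** 2

def intensity_classical(x):
    """Each electron takes exactly one slit: squares add directly."""
    d1, d2 = path_lengths(x)
    return abs(amplitude(d1)) ** 2 + abs(amplitude(d2)) ** 2

# The classical model gives a flat 2.0 everywhere; the quantum model
# oscillates between 0 and 4 - the interference fringes on the paper.
for x in range(0, 30, 3):
    print(x, round(intensity_quantum(x), 2), round(intensity_classical(x), 2))
```

Printing the two columns side by side shows the point: the classical column is featureless, while the quantum column swings between bright bands and dead zones, and only the quantum column matches what the electron-sensitive paper actually records (until we watch the slits, at which point the fringes vanish).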

Now, in all likelihood, what's occurring here is not an "actual" change in the nature of the particle but some kind of observational distortion similar to Heisenberg's uncertainty; that said, scientists have tried for years to disprove this and other strange aspects of quantum theory, to no avail. Philosophers and religious types, of course, have taken this "elevation of the observer" to dizzying heights (pardon the pun) in an attempt to "scientifically" validate free will, all manner of gods, or whatever unprovable phenomenon they prefer. The use of probability as a model for quantum theory also prompted Einstein's famous admonition that God "does not play dice with the universe"; unfortunately for the good professor, the observational evidence seems to disagree.

Even if we discount the philosophical extensions, an increased awareness of observation and its participation in the process can be very useful. In "real world" terms, there are far more practical impacts of observation than photons careening around like billiard balls: everything we observe not only has to pass through intermediary objects and multiple analog processing systems, but then has to be interpreted by us. Bringing into focus how our biology and psychology affect our observations (and thus our interactions with, and even thoughts about, everything around us) is the focus of general semantics, but that's for another day.

For now, just try to keep in mind that you aren't really reacting to the world - you're reacting to reactions to the world, and even then through a layer of interpretation.



Written Chinese is a logographic language: the symbols are representational, with each character standing for a full word or concept. For example, the character for "island" was originally a picture of a mountain in the water where birds land; the current form has been "blurred" by centuries of writing with a brush instead of a stylus, but you can still make out some of the concept.

What's important here is that pictures can't really denote tense - the designation of whether something already happened, is happening, or will happen in the future. As such, written - and even spoken - Chinese has no grammatical tense for its verbs. If you want to say you went to the doctor yesterday, you quite literally say, "I go to the doctor yesterday."

Chinese is of course not the only language with such quirks - and English has plenty of problems of its own. While these differences go a long way toward explaining the odd syntax people use in non-native languages, there are other implications.

Studies in linguistics suggest that, if your language doesn't have a word or concept for something, it is much more difficult for you to think in terms of that thing. Colors and musical scales are easy examples: anyone raised on "western" music will have a very hard time differentiating all the notes in Indian music. People joke that the Inuit have 50 words for "snow", but what they really have is 50 different things they can identify which, to us, are all simply "snow": we lack a differentiation that they can make.

Most people I know look at a block of code and simply say, "Oh, it's a program." I, however, differentiate between markup, interpreted, and compiled code; between imperative, procedural, object-oriented, and functional languages; and so on. I have words for all of these, and thus I also have the concepts behind the words.

And having the word and the concept influences other aspects. Philosophies in China (and similar Asian cultures) tend toward timelessness, reincarnation, and "balance" between action and non-action - all of which makes sense when we consider that their languages don't easily differentiate between past, present, and future. If I can't describe the differences between tenses, won't my phrasing - and eventually my thought pattern - reflect that kind of timelessness?

I've heard "genius" defined as the ability to conceive of something one doesn't yet have the language to describe. Many of us have invented words at times, trying to differentiate something that heretofore hasn't been differentiated. Sometimes it's a matter of nuance, and sometimes it's a concept that simply doesn't exist in our native language. A recent addition to English (beyond mere technical or "slang" terms) is the "meme", a term coined by Richard Dawkins.

This is probably the best argument for studying multiple languages - and not just ones with a common root. If you speak English, learning German - or even a Romance language - doesn't give you nearly as much as learning Greek, Chinese, or Russian. At the same time, English has become the international language not just because of the power of English-speaking countries like the US, but because of its amazing breadth: English has adopted, stolen, and incorporated words and concepts from other languages at a remarkable rate. It also has an unusually rich system of tenses, which allows for precise statements about time - a major benefit in technical and scientific fields.

And, yet, any language still has its limits. My favorite example of a concept which does not exist in English is that of the Greek arete: it's been loosely translated as "virtue" or "purpose", but "essence of being" would also be close. None are exact, though; it's a concept which only exists in abstract association in English, and yet was so intrinsic to Greek culture that it is the reason for the Olympics.

The important factor to remember is this: even the language you speak can frame the concepts of which you can think. It's important to try to work beyond linguistics to more abstract thoughts that cannot be easily converted into language. Who knows - you might even get to make up a new word.