Out Not Up<br />
When you look at the stars, you're looking out not up - change your perspective and change your world.<br />
<br />
Stuck in the Middle with You (2017-05-08)<br />
<br />
We hear a lot about the "middle class" in European-modeled societies, but most people lack a real understanding of what the middle class actually is.<br />
<br />
The easiest and perhaps clearest method for defining the middle class is by exclusion: i.e., what is <i>not</i> the middle class?<br />
<br />
We generally consider a modern society to have three classes: upper, middle, and lower.<br />
<br />
The upper class is generally thought of as the "leisure class", or "the rich", or variations thereof. However, the primary characteristic of the upper class from a sociopolitical standpoint is this: their day-to-day livelihood and stability are ensured passively rather than actively. Their lives are not subject to the whims of others or even dependent on their own labor or activities.<br />
<br />
To phrase it another way, those in the upper class generate incomes based on the efforts or workings of others: through investment, through ownership, through inheritance, etc. A person in the upper class need not do anything (or much) to maintain their lifestyle. Income is entirely passive. They can (and often do) choose to work in positions that interest them or provide some other benefit, but their core livelihood is ensured, for the most part, regardless of what they do. If a rich person decided to stay in bed all day (and a few have been rumored to do so), they would still be rich and continue to be rich.<br />
<br />
The lower class - the poor - is largely marked by constant effort and vigilance to maintain a living - not a lifestyle, but the actual necessities of day-to-day life. If a person in the lower class stops working, stops striving, stops putting out the effort - even for a short period - they often suffer major consequences with regard to their personal safety, security, health, or other basic needs.<br />
<br />
The second characteristic of the lower class is high dependence on others - the state, employers, charity, etc. - as part of that struggle. Any interruption of support from those sources, all of which are outside the control of the individual, can also cause serious or disastrous personal consequences for a member of the lower class. That "support" can often be indirect: a tenant, for example, depends on the whims of a landlord to maintain housing security.<br />
<br />
To summarize: the upper class has (near-)total control over its basic needs and must expend little to no effort to maintain them, while the lower class has limited control over its basic needs and must expend nearly all of its effort to maintain them and/or live in a state of near-total dependence on forces outside of its control.<br />
<br />
What, then, is the middle class?<br />
<br />
The cheeky answer is "somewhere in the middle", but it's also the correct answer. The middle class must expend some, but not all, of its efforts towards its basic needs. It is less dependent on <i>but not independent of</i> others, a dependence most often in the form of employment.<br />
<br />
From a simplified economic standpoint, we can consider the lower class as "those whose net worth is entirely equated with labor", the upper class as "those whose net worth is entirely separate from labor", and the middle class as "those whose net worth is a combination of labor and non-labor."<br />
<br />
If we think about what we generally consider to be characteristics of the middle class, these definitions - and especially the economic distinctions - make sense. The middle class works for a living but can afford to take time off for vacation, meaning that day-to-day survival is not dependent on day-to-day labor. The middle class generally starts to accrue assets - car, home, retirement funds, maybe a small investment account; these assets are thus divorced from labor itself and can even constitute a small source of income/increased net worth that is labor-independent. The middle class also strives to ensure inheritance of some of those accrued assets, further enabling a separation of survival/net worth from labor.<br />
<br />
This isn't to say that any of this is "right" in any ethical or moral sense; a large portion of leftist-progressive politics is based on how to separate labor from needs for the lower (and to a lesser extent middle) class, while right-conservative politics often looks to reduce dependence in the same groups. But understanding where the distinction actually lies may help shape these discussions.<br />
<br />
For example, in the modern US, there is a lot of discussion about the "shrinking middle class"; this framing helps us understand exactly what that means. In short, it is an increase in the dependence that middle-class individuals must place on their labor or on sources outside their control to meet their basic needs; living "paycheck to paycheck" is a prime example of this, regardless of why someone is doing so. It's also a decrease in the likelihood of accruing real assets; for most families, a home is the primary asset, but fewer and fewer families can afford to own homes.<br />
<br />
Another aspect of this shift is a change in dependency: historically, the middle class was most dependent on employers, in the form of healthcare, wages, pensions, and other benefits. Many of those benefits have either disappeared entirely or are actively shifting out of the employer/employee relationship, with the result that those in the middle class are being left without that source of support. Again, we can discuss whether separating retirement plans from employment (via 401(k)s, IRAs, etc.) or healthcare from employment (via single-payer or a hybrid like the PPACA) is morally or ethically good in the long run, but it is inarguable that this relationship, which in the past helped to build the middle class, is deteriorating.<br />
<br />
One clear example is simply the notion of company loyalty: It's now considered extremely odd to have worked for only one or two employers in one's career, whereas that used to be the standard. 40 years ago, an employee started at a company in their 20s and often retired from that company 30 years later. Today, the notion of loyalty from a company towards an employee (and, arguably as a consequence, from an employee towards a company) has largely vanished; in some job sectors, working for the same organization for more than 24 months is considered a troubling sign on a resume.<br />
<br />
As to why the middle class is considered so important in a society, that should be addressed elsewhere.<br />
<br />
#resist (2017-01-27)<br />
<br />
<i>A scientist makes observations through a telescope and publishes his findings. They are somewhat controversial, accepted by some and challenged by others. This does not deter him, as he is simply reporting the facts as they are observed.</i><br />
<br />
<i>The ruling authority gets involved, as the facts he has reported directly contradict what the authority believes and wants the population to believe. They chastise and harass him, trying to force him into retracting his ideas.</i><br />
<br />
<i>Instead, he publishes a book - ostensibly with the authority's permission - comparing his observations and theories with what the authority proposes, in language the average reader could understand. He presents his facts, rebuts the arguments of his detractors, and makes his case.</i><br />
<i><br /></i>
<i>The authority's response is immediate and total. Within a year, his books are banned from publication anywhere. His theories are verboten, and discussing them can lead to similar punishment. The authority must not be contradicted.</i><br />
<i><br /></i>
<i>He is sentenced to house arrest for the remainder of his life.</i><br />
<br />
This isn't the plot of a drama or movie. This is a brief version of the life of Galileo Galilei. That book, known as <i>Dialogo</i>, wasn't taken off the Church's list of banned books for about 200 years.<br />
<br />
One of the most famous diagrams from <i>Dialogo</i> depicts the Copernican model of planetary orbits (Copernicus himself never suffered for his theory, as he died on the eve of its formal publication). That page is shown below:<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEifbTuTn23YEymNs_GMDHC2dUpJcS0RNhIVZZ-rtXbC4H7Bq_C9KwYwQK347Dh8kTV2cP8yCyxIkBRBj5ZgmCRZMrKEyyvaIxaWKtOIXrtCPHAcW1dkYMJ0alfi_T4gfbUgATwQ03_sjeI/s1600/dialogo.JPG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEifbTuTn23YEymNs_GMDHC2dUpJcS0RNhIVZZ-rtXbC4H7Bq_C9KwYwQK347Dh8kTV2cP8yCyxIkBRBj5ZgmCRZMrKEyyvaIxaWKtOIXrtCPHAcW1dkYMJ0alfi_T4gfbUgATwQ03_sjeI/s400/dialogo.JPG" width="265" /></a></div>
Galileo's crime was speaking the truth to an authority - the Catholic
Church, in this case - that disagreed: Copernican theory was considered heresy, as it contradicted the Earth-centric Biblical view the Church officially supported. In 400 years, the facts haven't changed, and
many of Galileo's arguments and theories as presented in <i>Dialogo</i> hold up today, but he died a prisoner because he refused to deny the reality of the world.<br />
<br />
America
isn't at that point - yet. The current administration certainly seems
headed in that direction, with blatant denial of facts, muzzling of the
organizations that are responsible for publishing facts, and (it
appears) a process for political review of scientific information before
it can be released.<br />
<br />
Many people, especially
scientists, believe optimistically that facts are their own defense -
that the truth can stand on its own against lies and deceit. We have
already witnessed that this isn't the case. It is the duty of every
rational person to support the truth and deny any attempt at burying or
destroying it. The truth needs its defenders. Science needs an army.<br />
<br />
In the last week and in response to the administration's decision to silence all public communication from many government science organizations, members of those organizations set up alternative media accounts to continue getting the truth out. Right now, they are collections of individuals, and individuals are vulnerable. There needs to be a movement, and movements need something to stand for.<br />
<br />
I say that we follow Galileo and refuse to concede the truth, no matter the cost. Science needs an army, and I for one am willing to fight.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhI4uTjR0uE3wxuPAPA900Xqp6kOVFaeFCEiTI5Lx2bhOp4A-B_c3BJgiCGcLr4wUI2OO0U-PqrxiE99m1JBZRsUu_Wo0jHTJSFMoNqI7oFKrt8vLOEK4Z9-BPxGohma7-074XJrz7eQRg/s1600/resist.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhI4uTjR0uE3wxuPAPA900Xqp6kOVFaeFCEiTI5Lx2bhOp4A-B_c3BJgiCGcLr4wUI2OO0U-PqrxiE99m1JBZRsUu_Wo0jHTJSFMoNqI7oFKrt8vLOEK4Z9-BPxGohma7-074XJrz7eQRg/s320/resist.png" width="229" /></a></div>
<br />
<br />
The sexiest picture I've seen in years (2015-12-22)<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://pbs.twimg.com/media/CWzUJbpUkAEBOtD.jpg:large" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="266" src="https://pbs.twimg.com/media/CWzUJbpUkAEBOtD.jpg:large" width="400" /></a></div>
<br />
That's the Falcon 9 Flight 20 first stage - after reaching 100 km altitude and 5000 km/h, traveling 16 km downrange, separating from the second stage, turning around, and flying back - landing, vertically and under power, at Cape Canaveral almost exactly on target.<br />
<br />
As the first stage is approximately 70% of the cost of a launch, this is a very, very big deal for space flight. With the upcoming launch of the Falcon Heavy - which has two of these as "boosters" around a third core, all of which can potentially be landed the same way - we may actually see affordable major development in space.<br />
<br />
The choice of right (2014-10-21)<br />
<br />
Morality is fundamentally irreligious.<br /><br />I don't mean that religious people can't be moral. I simply mean that the fundamental basis of morality has nothing to do with religion.<br /><br />It is, in fact, entirely aesthetic.<br /><br />Many people like to go on about their reasoning or rationale for various moral choices - whether religious or not: there's a major topic in atheism about building a rational morality. But all of these systems come down, at one point or another, to considering life to have value.<br /><br />And therein lies the sleight of mind that most people miss.<br /><br />If you think life is valuable or important because your god says so, with no personal involvement, you aren't really moral - you're obedient. You've made no personal decision about right and wrong and merely follow orders.<br /><br />Even in that situation, though, I know of no active religion that doesn't have some element of self-determination - of freedom of choice - within it. And if there is choice, then there is the ability of the individual to decide whether or not to follow the dictates of the religion. What's the basis of that choice? Why does someone choose to follow the rules rather than disobey them?<br /><br />It's an aesthetic decision: an expression of personal taste or preference. It's not justifiable in any way, no more than preferring the color blue or the taste of chocolate. That aesthetic decision is, really, the fundamental moral act for the religious.<br /><br />For others (like atheists), the choice is a bit more direct: rather than adding the layer of choosing to value life (or not) because some deity does, they simply make the choice on their own. Again, it's often couched in vague or loose terms, and rationalizations are often presented if not simply the assumption of self-evidence ("of course life is valuable; everyone knows that!"). But it's still, at its core, an aesthetic choice.<br /><br />Once you've made that choice - or, if you're like me, drill down a little deeper and make it there - you can use whatever framework of logic or reason you like to build up the rest of your morality. Or, if you're religious, you just adopt the framework of a religion and can avoid spending much time analyzing the details (or do, if you feel like it). One can build a rational or religious (or both) morality upon the basic aesthetic decision, but it's impossible to build either without that aesthetic decision.<br /><br />If you believe that there is no such choice - that it is impossible not to choose to value life or whatever you see as the basis of morality - then you're back to mechanical obedience, and morality cannot exist, whether it's obligatory obedience to a god or forced behavior due to physics and chemistry.<br />
<br />
Hiding in plain sight (2014-06-17)<br />
<br />
I like taking pictures. I'm not a professional photographer, but it is a hobby I enjoy.<br />
<br />
Someone meeting me on the street wouldn't know this. Most of my coworkers don't even know this. Heck, many of my friends haven't seen my photos. Every time the subject of photography comes up, I have the option of mentioning that I do photography. Occasionally, someone else who knows will say it for me.<br />
<br />
Now, photography isn't really a controversial topic (unless you're in a media class), so the question of whether or not to out myself as a photographer isn't really important to anyone but me.<br />
<br />
(And yes, you may see where this is going, but bear with me.)<br />
<br />
There are many facts about ourselves that we choose to reveal or conceal in different environments, most without impact beyond ourselves. The choice is (mostly) ours on how much we wish to include others in our lives and how much of our lives we wish to include in our relationships with others, be they coworkers, friends, family, lovers, or anything else.<br />
<br />
Some facts, however, are far more controversial. Sexuality is an obvious one, and one with which I'm very familiar, but it's not the only example: mental or non-visible physical differences, disabilities, or disorders are another (do you tell someone you have a hearing aid? that you take Prozac every morning? that you have a sixth toe?); another might be being transgender. Even things like marital status or level of education can be points of conflict in different environments.<br />
<br />
With respect to sexuality, there are active social forces in many places, including the United States, that are very much against anything other than basic man/woman relationships. These forces are (at least in the US, but I see it elsewhere as well) slowly losing power, but they remain major sociopolitical factors for the moment.<br />
<br />
There's an argument often made, especially by individuals in the gay community, that anyone who is gay<i> ought</i> to come out: it's often framed as a kind of debt, that such people <i>owe it to society</i> to be visible and proud and such. The argument is legitimate: the more gay persons are visible in society as regular, everyday people, the harder it is for the opposition to demonize the group as a whole.<br />
<br />
I have a couple of problems with this.<br />
<br />
The first is an emotional/boundary problem: it's <i>my</i> sexuality. It's part of who I am, as an individual. To whom I express or reveal that part, and how much, and when, is entirely a decision that only I can and should make. No one has a right to force me out of the closet (no, not even if I'm a raging hypocrite - sexuality is not a weapon to use against someone), even with the best intentions.<br />
<br />
The second problem, though, is one that goes back to being a photographer. It doesn't matter how many times I tell people I'm gay: there's always someone new coming along who doesn't know. We often speak of "coming out" like it's an event, a touchdown or a party (or a disaster), but it's not. It's a process that has to be repeated over and over and over. <i>Coming out never ends</i>.<br />
<br />
Do you realize how much of a burden is being placed on someone by insisting they <i>must</i> come out? For someone like me, who isn't "obviously gay" and is regularly assumed to be straight, should I wear a button that says, "Hi! I'm gay!" everywhere I go? Do I need to preface every initial meeting with someone with a detailed description of my sex life?<br />
<br />
Even for me, it's a burden. I frankly don't care if people know that I'm gay - no more than I care if they know I'm a photographer. I'm not ashamed or hiding the fact. But the effort it takes to constantly be coming out, to constantly be answering the questions that everyone asks (and they're always the same questions), is huge. That effort is more than I care to make most of the time; most relationships just aren't worth it.<br />
<br />
And it's just as true, if not more so, for all those other hidden facts about people. I don't tell everyone I know that I'm ADHD, or dyslexic, or OCD, or slightly sociopathic, or allergic to alcohol and strawberries, or... All of these things come with an added degree of effort to maintain any relationship in which they're mentioned.<br />
<br />
I understand the sociological context of the "come out, come out, wherever you are" mentality. And I very much want the world to be a place where people feel they are safe to come out, whether it's about being gay or diabetic or transgender or Autistic or even a photographer; that "want" bears with it a social burden that must be borne by someone in society. But I don't think that anyone - <i>anyone</i> - has an <u>obligation</u> to come out in <i>any</i> scenario, nor that anyone has the right to "out" someone else in <i>any</i> scenario (with the possible exception of something life-threatening).<br />
<br />
Stop the harm (2013-12-19)<br />
<br />
Every decision to act is a cost/benefit analysis, even if we aren't aware of it consciously.<br />
<br />
The difficulty comes in deciding what the actual costs and benefits are, and how to quantify them relative to each other. What is the cost of a broken heart today versus one in three months? How do we measure the benefit of a random smile?<br />
<br />
One thing most people agree on is that a cost borne involuntarily is worse than one borne voluntarily; in practical terms, intentionally or accidentally forcing a cost on someone is ethically worse than intentionally or accidentally taking one on yourself.<br />
<br />
Applying this logic to incidents of violence is generally fairly simple: if you see a fight, you try to stop it and prevent anyone - assailant or victim - from being hurt, even if you don't know what they're fighting about. Now, we have a legal system that is moderately successful, so we have the advantage of knowing that even if the attack is somehow justified, stopping it likely won't stop the eventual justice. Even without that, however, it simply makes the most sense to <i>stop the harm</i> first and worry about the rest of the factors afterwards. <br />
<br />
When we try to apply this same logic to incidents of discrimination or bigotry, however, that basic calculus seems to get lost somewhere. We often hear that we should work on helping the perpetrator with education or debate, being "nice" to them rather than chastising or confronting them. Rather than helping the victim, the arguments often change to minimizing the harm to the perpetrator.<br />
<br />
If person X is doing something that harms person Y, it shouldn't matter if the harm is a physical attack or a discriminatory act: the ethical responsibility is to first stop the harm to person Y. Once that's done - once person Y is no longer being involuntarily subjected to harm because of X's actions, regardless of what those actions are or how X is being affected by those actions - we can then look at X's situation and try to determine the most ethical course of action with them. <i>The harm has to stop first.</i><br />
<br />
If we neglect this duty, if we allow harm to continue to be inflicted on victims while trying to somehow make the act of stopping the harm more palatable or less inconvenient to the perpetrators, we are complicit in the causing of the harm. We're no better than the perpetrators.<br />
<br />
In simple terms: you don't ask the guy attacking people with a knife if he has a bad relationship with his mother or why he's doing it. You tackle him, disarm him, and <i>then</i> worry about his motivation. The same is true if the knife is, instead, language, power, or privilege.<br />
<br />
The Yellow Clarinet (2013-04-08)<br />
<br />
The illusion of control is a very important thing.<br />
<br />
How many superstitions about bad luck are there? The number 13, walking under a ladder, breaking mirrors, real flowers on stage - there are numerous (probably too many to count) variations on what should or shouldn't be done to influence the luck one has. Some of them seem kind of obvious - walking under a ladder doesn't seem terribly brilliant to start with, as something or someone might fall off, but I'm not sure how many people need to be advised not to randomly break mirrors (it seems, at best, a fractured hobby). Some of them likely arise from a random confluence of incidents or from historic prejudice.<br />
<br />
Most people think most superstitions are silly - except their own. I know hardened atheists and skeptics who knock on wood. Of course, the "mote in your eye, beam in my own" phenomenon is fairly common among various belief systems, and superstitions are, after all, a kind of extension of the same phenomenon: the need to exert - or at least pretend to exert - some control over the random events in one's life.<br />
<br />
But there is a more practical, more sinister side to this illusory sense of control.<br />
<br />
How many times have you heard, "It's his own fault he got mugged - he was in <i>that</i> part of town after dark!" or, "Tsk, she would have been fine if she hadn't been wearing a dress like that" or similar statements?<br />
<br />
It's the same thing: this notion that the person (in this case the victim) had control over how the situation played out. While this may be true to an extent, in situations where the person is the victim of a crime, there is only one person who had 100% control over how the events played out:<br />
<br />
The criminal.<br />
<br />
The concept is known in psychology as Defensive Attribution Hypothesis - the notion on the part of the bystander that "if only she/he/they had done (something) different, this wouldn't have happened." The "goal", if one can call it that, is to determine causal mechanisms for the event that the bystander has control over and, therefore, that the bystander can use to prevent the event from happening. It's a psychological defense mechanism: "this can't happen to me because I'm in control."<br />
<br />
The primary problem, though, is that if the bystander is in control, then so was the victim (at least theoretically), which means that, according to the bystander, <i>the victim is at fault for being robbed/attacked/(insert crime here)</i>.<br />
<br />
On the surface level, this is ridiculous. I hear, in the virtual echo chamber of the internet, the audience saying, "Well, of course not, but..."<br />
<br />
No "but". No qualification. The only person who has control over - and thus responsibility for - committing a crime is the criminal. One can speak of ways to reduce risk - after one has plainly and unambiguously acknowledged this fact. One can talk of how to defend one's self - after one has admitted that the need for defense is entirely because of the attackers, not the attacked.<br />
<br />
The illusion of control is a very important thing. It helps people to maintain at least a fleeting sense of security in a world where very few things are guaranteed. However, it is <i>an illusion</i>. Try to remember that the next time you wish someone "break a leg" or hear about a woman being raped.<br />
<br />
This is how science works (2012-10-02)<br />
http://blogs.scientificamerican.com/scicurious-brain/2012/09/25/ignobel-prize-in-neuroscience-the-dead-salmon-study/<br />
<br />
The basic premise is that a bunch of scientists put a dead salmon in an fMRI scanner and demonstrated brain activity.<br />
<br />
Yes, in a dead, frozen salmon.<br />
<br />
<blockquote class="tr_bq">
So in the final results, the authors compared the normal multiple
comparisons, with the multiple CORRECTED comparisons. When they used the
multiple corrected comparisons, the dead salmon showed nothing. When
they did the multiple comparisons without the correction, the salmon
showed significant increases in “activation”, coincidentally, in the
brain and spinal cord. This shows the importance of correcting for
multiple comparisons and avoiding false positives</blockquote>
<br />
<blockquote class="tr_bq">
<br />
The original poster almost didn’t make it to a conference, but when
it did, it made a major splash, and reactions were very positive. Some
people like to use the salmon study as proof that fMRI is woo, but this
isn’t the case, it’s actually a study to show the importance of
correcting your stats. </blockquote>
<br />
<blockquote class="tr_bq">
And the poster, and the paper that was eventually published, may have
had an effect on the field. The authors note that at the time the
poster was presented, between 25-40% of studies on fMRI being published
were NOT using the corrected comparisons. But by the time this group won
the Ignobel last week, that number had dropped to 10%. And who knows,
it might, in part, be due to a dead fish.</blockquote>
<br />
Yes, that's right, a frozen fish - and the analysis of it - may have improved medical science.<br />
<br />
The takeaway point, though, is that you can do your data collection correctly but screw up your analysis and still end up with a bad result.<br />
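<br />
To make the statistics concrete, here is a minimal sketch of the multiple-comparisons problem (my own illustration, not the study's actual pipeline; it assumes numpy and scipy, and all the numbers are invented): test thousands of pure-noise "voxels" and count how many look "active" before and after correction.<br />
<br />
<pre>import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_voxels = 10_000   # hypothetical number of voxels in a scan
n_scans = 20        # hypothetical measurements per voxel
alpha = 0.05

# Pure noise: there is no real signal anywhere (our "dead salmon").
noise = rng.normal(size=(n_voxels, n_scans))

# One-sample t-test per voxel against a true mean of zero.
p_values = stats.ttest_1samp(noise, popmean=0.0, axis=1).pvalue

uncorrected = int(np.sum(p_values < alpha))            # false positives
bonferroni = int(np.sum(p_values < alpha / n_voxels))  # corrected

print(f"'Active' voxels, uncorrected: {uncorrected}")  # roughly 500
print(f"'Active' voxels, corrected:   {bonferroni}")   # almost always 0</pre>
<br />
At alpha = 0.05, roughly 5% of 10,000 noise-only tests will come up "significant" by chance alone; dividing the threshold by the number of tests (the Bonferroni correction, one of several such methods) is what makes the phantom "activation" vanish.<br />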
<br />
Or, sometimes, you're looking in (2012-08-06)<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://talkingpointsmemo.com/images/Curiosity-Landing-Aerial.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="424" src="http://talkingpointsmemo.com/images/Curiosity-Landing-Aerial.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Curiosity (inside its sky crane) as it parachutes down to the surface of Mars, as seen from the Mars Reconnaissance Orbiter</td></tr>
</tbody></table>
Off the planet, past the moon, across space, through the atmosphere, under the parachute, released from a sky crane...<br />
<br />
... nothin' but net.<br />
<br />
Standing (2012-06-04)<br />
<br />
So, you're some unimportant, average person in your city. You know there's some kind of protest or sit-in going on nearby - you walk through the park regularly, and it's been on the news - but you're not participating. You may agree, you may disagree; it doesn't really matter.<br><br>
You've been out buying groceries, and on your way back, you decide to cut across the square like you always do. As you cross the street, you see a line of tanks - your own government's military - heading away from where the protestors (mostly students, but almost all citizens) had been. Where they'd just killed hundreds if not thousands.<br><br>
<a href="http://www.youtube.com/watch?v=xh78iSg_ZvU">What do you do?</a><br><br>
What kind of a person does it take to go from a casual walk home to standing in front of a moving line of tanks? What went through his mind? We don't know his name, his history, his motivations, or anything - all we know is what was caught on film: a man standing alone on a road.<br><br>
If it happened in your city - if you were walking along and saw a line of tanks heading towards a protest nearby - what would you do? Would it matter who the protestors were? If they were democratic students or religious fundamentalists or just a bunch of people having a bad day? Would you hesitate? Would you keep walking? Or would you stand them down?<br><br>
What kind of person are <i>you</i>?<br><br>
(As a note, I'm explicitly not using certain words and phrases here. The events of that day twenty-three years ago are still heavily censored in that country. They have been written out of history books, are banned from all forms of media, and have been the cause of books and such being burned. The people who most need to know about the events that happened - that nation's youth of today - are officially forbidden from learning about them. Think about that a while.)<br />
<br />
Fighting Back (2012-03-29)<br />
<br />
How many times have you found yourself arguing for a position you didn't really support merely because someone had tried to impose upon you the opposite position?<br />
<br />
Classically, this is the behavior of teenagers - in extreme cases classified as "oppositional defiant disorder" - but even adults experience this kind of inherent oppositional reaction when we feel our independence is being challenged.<br />
<br />
In psychology, this is known as "psychological reactance theory". From <a href="http://www.psych-it.com.au/Psychlopedia/article.asp?id=65">Psychlopedia</a>:<br />
<blockquote>Psychological reactance is an aversive affective reaction in response to regulations or impositions that impinge on freedom and autonomy (Brehm, 1966, 1972, Brehm & Brehm, 1981; Wicklund, 1974). This reaction is especially common when individuals feel obliged to adopt a particular opinion or engage in a specific behavior.<br />
<br />
Specifically, a perceived diminution in freedom ignites an emotional state, called psychological reactance, that elicits behaviors intended to restore this autonomy (Brehm, 1966, 1972, Brehm & Brehm, 1981; Wicklund, 1974). Reactance, for example, often encourages individuals to espouse an opinion that opposes the belief or attitude they were encouraged, or even coerced, to adopt. As a consequence, reactance often augments resistance to persuasion (Brehm & Brehm, 1981). Reactance was proposed to explain many common examples of resistance in society, such as the adverse effects of prohibition.</blockquote><br />
Yes, this is a real, traceable, verified concept: an individual is more likely to be confrontational when s/he perceives that his or her freedoms are being impinged upon. It doesn't matter why, or whether or not they agree with the underlying principles: merely the act of feeling <i>confined</i>, physically or mentally, induces a basic "fight or flight" response that comes out as oppositional defiance.<br />
<br />
Keep this in mind both as you watch your own behavior and as you watch how others respond to you. If you know you are predisposed to a negative response in certain situations, you can mitigate that response to some degree and prevent yourself from getting into confrontations you don't necessarily want to have. You can also strive to prevent others from feeling the same kind of confinement and, thus, hopefully reduce their tendency to be confrontational in return.<br />
<br />
<i>(Sorry for the long lag between posts; I'll try to get at least one a month up.)</i><br />
<br />
Once Upon a Time (2011-09-13)<br />
<br />
A raven, flitting around somewhere, notices a silver spot. Being curious, it pecks at it, and a seed falls out of a nearby hole. It pecks again, and another seed falls out. Now, whenever it pecks that silver spot, it expects a seed; whenever it wants a seed, it will peck the silver spot.<br />
<br />
This is learning. Learning is entirely based on predictive pattern analysis: the ability to say, "in the past, every time X has occurred, Y has resulted," and use that information for future prediction. Advanced learning is being able to extrapolate that cause/effect into areas not identical but similar to the original situation: maybe the spot is gold, and maybe it's a piece of candy instead of a seed.<br />
<br />
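As a toy illustration of that idea (a sketch of my own - the cue and outcome names are invented, not from the post), "predictive pattern analysis" can be as simple as counting which outcome has followed each cue and predicting the most frequent one:<br />
<br />
<pre>from collections import Counter, defaultdict

class StoryLearner:
    """Learns 'every time X has occurred, Y has resulted' by counting."""

    def __init__(self):
        self.outcomes = defaultdict(Counter)

    def observe(self, cue, outcome):
        # Record one experience: 'outcome' followed 'cue'.
        self.outcomes[cue][outcome] += 1

    def predict(self, cue):
        # Predict the most frequent outcome seen after 'cue', if any.
        seen = self.outcomes[cue]
        return seen.most_common(1)[0][0] if seen else None

raven = StoryLearner()
for _ in range(5):
    raven.observe("silver spot", "seed")

print(raven.predict("silver spot"))  # -> 'seed'
print(raven.predict("round spot"))   # -> None: no story built yet</pre>
<br />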
In simpler terms, the act of learning is an act of building stories - even ones far less interesting than "Jack and the Beanstalk." We build, in our heads, a narrative course of actions that we can use to inform ourselves as well as others in the future. Our brains are very good at building these stories: being good at stories - being good at learning - provided a heritable survival advantage, so our ancestors who were better at building stories tended to survive more.<br />
<br />
It's important, however, to recognize when our story-building goes awry: when we don't have enough data or experience to build a realistic cause/effect model, or when we highlight the <i>wrong</i> thing as being the cause or effect. This is <i>also</i> something we tend to be good at, unfortunately, because it's a side effect of constantly looking for stories.<br />
<br />
The most critical part of any story we build is that we be open to modifying, expanding, or removing it based on experience in order to keep it useful: remember, if the story can't predict anything, it's not useful and may even be counter-productive as we waste energy in support of a pattern that doesn't exist. If the raven finds that it's not just silver spots, but <i>square</i> silver spots, that result in seeds, continually pecking a <i>round</i> silver spot and expecting a seed is a waste of energy; if it goes on long enough, the raven may starve.<br />
<br />
Stories are extremely powerful, and while they can be useful, we need to constantly check and validate them to prevent them from becoming detrimental, both to ourselves and to society.<br />
<br />
Privilege (2011-07-06)<br />
<br />
privi: private<br />
lege: law<br />
<br />
Privilege means, literally, "private law". Operating from a position of privilege means that the rules are applied differently - or are just not applied at all - for you.<br />
<br />
There are many kinds of privilege. Some are earned, at least partially. Many are gained through luck or circumstance. Some are assumed even when not held.<br />
<br />
The key, though, is that in all cases of privilege, the privileged person is acting from a position of strength or advantage over others who aren't privileged. The person with privilege doesn't have to be doing this deliberately or maliciously, or even be aware that s/he is doing it at all; often, one advantage of privilege is <i>being unaware that a state of privilege even exists</i>: those who have privilege may not be aware of it, while those who do not have it generally have no choice but to be aware of it. Furthermore, exercising a privilege generally (though not always) involves detriment to someone not privileged.<br />
<br />
There are many privileges that are granted through general culture, so something that provides privilege in one culture may not provide it (or may actively deny it) in another.<br />
<br />
An example of a privilege that most people wouldn't think about: my parents are both great at finances, and have taught me about them from a young age. That's a privilege - it's an advantage most kids don't have. I didn't "earn" it in any way, and in this case my having it doesn't necessarily detract from others. But the knowledge I have gained because of my parents' fiscal ability is something that most of the people I know don't have.<br />
<br />
An example of a privilege that is easy to spot is that I'm male. Being a male, even (or especially) in modern US culture, provides a plethora of advantages that most men aren't really aware of. We all know about things like wage differences, but even things like "not having to be vigilant about rape prevention" are privileges that men have and women don't.<br />
<br />
An example of a privilege I lack is that I'm not straight. This goes far beyond issues like same-sex marriage and discrimination at work: the assumption of heterosexuality is so prevalent in culture that there is quite often a feeling that I have to hide my sexuality in everyday conversation or make other people angry/upset/flustered. Straight people don't have that stress: a straight man talking about his wife/lover/girlfriend does not cause any kind of disruption, but a gay man talking about his boyfriend/husband or a lesbian talking about her girlfriend/wife <i>does</i>. Thus, straight individuals are operating from a position of privilege.<br />
<br />
We often can't avoid privilege - I can't stop being a guy, or at least not in ways that anyone would find reasonable, and I can't change who my parents are - but we can do our best to make sure that, in the exercising of privilege, we do not hurt or disadvantage others who are not privileged. This is often easier done than it seems, but first we have to be aware of the advantages we have that others lack. Also, we have to realize that, sometimes, leveling the field again isn't enough: to counteract privilege we, the privileged, must sometimes go out of our way to actively disadvantage ourselves in order to help someone who is unprivileged - especially if that privilege is endemic to culture and widespread.<br />
<br />
So close... and yet, so far (2011-07-03)<br />
<br />
<div class="separator" style="clear: both; text-align: center;"><a href="http://www.flickr.com/photos/obenchainr/5898489734/" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="http://farm6.static.flickr.com/5306/5898489734_5d5aca413a.jpg" width="212" /></a></div>Always remember: we're looking out, not up.<br />
<br />
Once around the park (2011-04-12)<br />
<br />
50 years ago today, a man orbited the Earth for 108 minutes. His primary concern, before launch, was making sure he had enough sausage to last him the trip.<br />
<br />
The fact that the man was a Russian, and that, at the time, the US and the USSR were engaged in a bitter cold war, has no bearing on the significance of the event. As Michael Collins said of the moon landing some 8 years later, the accomplishment transcends borders and nationalities: "Everywhere we went, instead of saying 'you did it, you Americans did it,' they were saying 'we did it, we, the human race, did it,' and I thought that was a wonderful thing."<br />
<br />
Today, space programs are international, with almost all the major players cooperating on a world stage. Even as private space flight begins to take off (pun intended), space travel remains one of the few egalitarian concepts in a world rocked by conflict.<br />
<br />
The US has led the world in accomplishments in space since landing on the moon. We should do everything we can to encourage our population and scientists to continue exploring this next great frontier. At the same time, though, we must remember that we weren't the first ones there: we owe much of our drive and inspiration to Yuri.<br />
<br />
5% (2010-12-31)<br />
<br />
We humans are silly, irrational creatures. Neurologically, we make decisions based on bad or misleading assumptions, impractical expectations, and often raw, unfiltered emotion. Psychologically, we then try post-hoc justification of what we've already decided, often allowing ourselves to jump through ridiculous logical hoops to try and seem reasonable. If that fails, we might even abandon any pretense of rationality and simply appeal to unknown or unknowable forces that don't have to obey the rigid laws of reality.<br />
<br />
By all rights, we ought to still be living in caves and banging rocks together.<br />
<br />
And yet, we're not. We live in skyscrapers and mansions, apartments and houses. We travel around our planet at hundreds of miles an hour, or off of it at thousands. We create languages that lead to novels and poetry, instruments that produce punk rock and symphonies, artworks that inspire great emotions.<br />
<br />
Of course, we've also created weapons of mass destruction. We've severely imbalanced if not outright destroyed entire ecologies. We've perfected genocide, popularized prejudice, and fought for thousands of years in the names of so-called benevolent deities.<br />
<br />
95% of what we do as a species is irrational to say the least. It's that last 5%, though, that must give us pause. That last 5% gives us science, and reason, and medicine, and technology... all the things we associate with progress.<br />
<br />
And while hope is silly, irrational, a product of the first 95%, it is hope that drives us again and again to thinking that maybe, just maybe, that last 5% can, in the end, make up for all the rest.<br />
<br />
Which brings us to tonight, and my own personal moment of irrationality: here's hoping that, in the coming year, every one of you finds enough benefit from that last 5% to make the other 95 worthwhile.<br />
<br />
Happy New Year.<br />
<br />
--Austin<br />
<br />
Tradition (2010-11-23)<br />
<br />
It is the method by which we perpetuate culture, passing on beliefs, ideas, or customs from one generation to the next. Traditions form the basis of cultural identity, the mortar with which a family, a city, a country builds its personality. They are often unassailable bastions at the core of history for a people.<br />
<br />
And that's why they're dangerous.<br />
<br />
Traditions <i>are</i> often immune to challenge, merely by virtue of being "traditions": doing something "because that's the way we've always done it", whether it's who carves the turkey, how we set up the Christmas decorations, or how we venerate people in the past. And while culture is often an excuse, "the past" is really the key point here, because tradition looks only to the past and only with longing in its eyes.<br />
<br />
The hidden assumption in tradition is this: the way things were done before is the best way and will always be the best way going forward. That assumption is quite often wrong. If a behavior is, in fact, the best method, then it should be able to stand on its own without being labeled "traditional"; if it is not, then no amount of reverence for history should prevent the best choice from being made. "Tradition" implies that a bunch of desert nomads from 2000 years ago could know what's best for people living in hundred-story high rises. It implies that a handful of rich white men from a few hundred years ago knew what was best for computer gamers and gun owners today. It argues that, because something was done in a specific way at some point in the past, that way is now <i>a priori</i> the best way and should never change.<br />
<br />
As Toscanini said, "Tradition is just the last bad performance."<br />
<br />
There are reasons to value cultural, historical or social traditions <i>in the context of history</i>, such as teaching traditional art forms, languages, or even rites. But those traditions should never be divorced from the time period in which they arose or their own cultural histories: why things were done a certain way, what the justification was, the effects of that tradition on future history, etc. Traditions should be valued as artifacts in the same way we value physical artifacts: as curiosities and points of reference, not as relevant for today's world. We can appreciate the role of ceremonial dance and religion in the political and social framework of pre-modern cultures without feeling the need to revere them, the same way we can appreciate an old flint knife without feeling the need to give up stainless steel or carbonite.<br />
<br />
History shouldn't be ignored, but neither should it be placed on a pedestal and worshipped. Good practice and good information can stand on their own without arbitrary enforcement: if something is <i>useful</i>, it is useful whether it is old or new. We study Aristotle, Euclid, Newton and Freud not because of tradition but because of the inherent value of their statements and ideas, even if we've proven them "wrong" (or at least restrictive) in the intervening years: they are stepping stones along the path to modern logic, mathematics, physics and psychology. Just as we can appreciate Bach and Mozart without limiting ourselves to Baroque music and abandoning the Romantic period, we can appreciate the trappings of history without feeling the need to continue them and, instead, improve or even abandon them as need arises.<br />
<br />
As we approach a holiday season that is often draped in traditions, don't be afraid to abandon them. If you find yourself sad or depressed, it may be because you're trying to cling to an outmoded idea of "should" that no longer fits - a tradition that is no longer useful. Embrace whatever celebrations or routines that you find useful, even if they aren't "traditional": go out instead of eating in, invite close friends to dinner instead of family, skip the presents and donate to charity or spend time at a soup kitchen - whatever it is that you feel is a better fit for you and your life today, not what you've been taught "should" be done.<br />
<br />
Tradition looks to the past, but, while we must remember what has come before so as not to repeat the same mistakes, the past is not a model by which we can live. Progress exists only in the future, and with it comes the most powerful force we know - the thing that can forever destroy the historical, repetitive, <i>traditional</i> grievances of life:<br />
<br />
Hope.<br />
<br />
Seeing isn't believing (2010-11-16)<br />
<br />
"Observation" isn't a direct action.<br />
<br />
We see things because light bounces off of them and reflects into our eyes (biological processes aside). We often think of this as "seeing the object", but the key here is that light itself is a thing, and it's really the light you're seeing: you have no direct interaction with the object. You only interact with it through the intermediation of light.<br />
<br />
All observation takes place like this: whether it's photons (light), electrons, or anything else, the only way to "observe" something remotely is to bounce things off of it. Most of the time, this doesn't matter much: photons bouncing off of "normal"-sized objects don't do much to the object. When we're talking about atomic or subatomic levels, however, the relative sizes between the thing we're trying to observe and the particle we're using to "see" it are much closer. As such, "bouncing" anything off of them knocks them around, similar to pool balls on a table.<br />
<br />
The result, as Werner Heisenberg determined, is that we can only take one measurement - one observation - of a particle at any point, because as soon as you take one (usually of position or momentum), you change the particle's other properties. This has been formally labeled "Heisenberg's Uncertainty Principle", and it forever broke down the myth of the "impartial observer" - someone (usually a scientist) who could sit back and "just watch" without interfering with a process.<br />
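<br />
Stated quantitatively, the principle puts a hard lower bound on the product of the uncertainties in a particle's position and momentum - Δx · Δp ≥ ħ/2, where ħ is the reduced Planck constant - so pinning one quantity down more precisely necessarily blurs the other.<br />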
<br />
Heisenberg's was the first blow to the notion of the independent observer, but it wasn't the last. Quantum theory, as it has developed, has focused more and more on particles themselves existing in what we identify today as some kind of probability wave. In short, if you shoot an electron out of a tube pointed in a given direction, you can have a general idea about where the electron may be at any given moment, but you can't be sure. That's not surprising in itself for most people; the part that is hard to understand is that, for the observed results to be correct, saying the electron is actually at any one definite place doesn't work. The uncertainty in its location isn't just a factor of us not knowing - it seems to be intrinsic to the particle itself.<br />
<br />
Here's the classic experiment. We have an electron gun pointed at a wall with two slits in it; logically, the electron can only travel through one or the other slit. However, if we place a piece of electron-sensitive paper on the other side of the slits to see where the electron ended up, we get something odd: the pattern that results can only occur if the electron takes <i>both</i> possible paths, not just one or the other. This is true even if the electrons are fired one at a time, with enough time for the first to hit before the second is released.<br />
<br />
Let me restate that, because it's possible to miss the implication: in order to account for what we actually see happen, we have to assume that part of the electron - something indivisible under normal circumstances - passes through each slit. The notion that we can't know its position isn't just a factor of our not having the precision to determine it: according to results, the particle <i>has no definite position</i> until it reaches the paper and, it seems, has to take all potential paths to get there. Our best understanding of this is that it exists only as a <i>probability</i>, and that somehow those potential probabilities are what interact and cause the observed pattern.<br />
<br />
It gets weirder: if we set up a system to watch the electron as it passes through the slits so that we can <i>observe</i> which one it goes through, the odd pattern requiring the probability explanation disappears; the result is explainable with classical physics models. So, it appears that the electron behaves like a single particle once we observe it, but until we do, it behaves like a set of probabilities. Furthermore, this is true not just of electrons but of every particle the process has been tested with: as near as we can tell, <i>all</i> matter exists probabilistically until observed and classically after observation.<br />
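<br />
The difference between "a set of probabilities" and "a single particle" can be sketched numerically. Below is a toy far-field model (my own illustration, assuming numpy; the wavelength, slit spacing, and angles are arbitrary): adding the two paths' complex amplitudes and <i>then</i> squaring produces interference fringes, while squaring each path's amplitude first and then adding - the "we watched which slit it went through" case - produces a flat, fringe-free result.<br />
<br />
<pre>import numpy as np

wavelength = 1.0        # arbitrary units, for illustration only
slit_separation = 10.0  # distance between the slits, same units
angles = np.linspace(-0.3, 0.3, 9)  # a few points on the screen

# The path-length difference between slits appears as a phase shift.
phase = 2 * np.pi * slit_separation * np.sin(angles) / wavelength

amp_slit1 = np.ones_like(phase, dtype=complex)  # reference path
amp_slit2 = np.exp(1j * phase)                  # phase-shifted path

# Unobserved: amplitudes add first, THEN square -> fringes (0 to 4).
both_paths = np.abs(amp_slit1 + amp_slit2) ** 2

# Observed at the slits: probabilities add -> constant 2, no fringes.
which_slit = np.abs(amp_slit1) ** 2 + np.abs(amp_slit2) ** 2

for a, b, w in zip(angles, both_paths, which_slit):
    print(f"angle={a:+.2f}  both-paths={b:.2f}  which-slit={w:.2f}")</pre>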
<br />
Now, in all likelihood, what's occurring here is probably <i>not</i> an "actual" change in the nature of the particle but some kind of observational distortion similar to Heisenberg's Uncertainty; that being said, scientists have tried to disprove this and other weird aspects of quantum theory for years to no avail. Philosophers and religious types, of course, have taken this "elevation of the observer" to dizzying heights [pardon the pun] in an attempt to "scientifically" validate free will, all manner of Gods, or whatever unprovable phenomenon they prefer. The use of probability as a model for understanding quantum theory also led to Einstein's famous admonition that God "does not play dice with the universe"; unfortunately for the good professor, observational evidence seems to disagree.<br />
<br />
Even if we discount the philosophical extensions, the increased awareness of observation and its participation in the process can be very useful. In "real world" terms, there are far more practical impacts of observation than simply photons careening around like billiard balls: namely, the fact that everything we observe not only has to pass through intermediary objects and multiple analog processing systems, but that it then has to be interpreted by us. Bringing into focus how our biology and psychology can affect our observations (and, thus, our interactions with and even thoughts about everything around us) is the focus of <i>general semantics</i>, but that's for another day.<br />
<br />
For now, just try to keep in mind that you aren't really reacting to the world - you're reacting to reactions to the world, and even then through a layer of interpretation.<br />
<br />
Words (2010-11-02)<br />
<br />
Written Chinese is a logographic language: the symbols are representational, with each depicting a full word or concept. For example, the word for "island" was, originally, a picture of a mountain in the water where birds land; the current version has been "blurred" by the use of brush instead of stylus, but you can still make out some of the concept.<br />
<br />
What's important here is that, with pictures, you can't really denote tense - the designation of whether something already happened, is happening, or will happen in the future. As such, written - and even spoken - Chinese has no real tense for any verb. If you want to say you went to the doctor yesterday, you quite literally say, "I go to the doctor yesterday."<br />
<br />
Chinese is of course not the only language that has this issue - and English has plenty of problems of its own. While these differences go a long way towards explaining the odd syntax foreigners use in non-native languages, there are other implications.<br />
<br />
Studies in linguistic relativity suggest that, if your language doesn't have a word or concept for something, it is much more difficult - sometimes nearly impossible - for you to <i>think</i> in terms of that thing. Colors and musical scales are easy examples: anyone raised on "western" music will have a very hard time differentiating all the notes in Indian music. People joke that the Inuit have 50 words for "snow", but what they really have is 50 different things they can identify which, to us, are all simply "snow": we lack the distinctions they can draw.<br />
<br />
Most people I know look at a block of code and simply say, "Oh, it's a program." I, however, differentiate between markup, interpreted, and compiled code; between imperative, procedural, object-oriented, and functional languages; etc. I have <i>words</i> for all of these, and thus I also have the concepts behind the words.<br />
<br />
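To make that concrete, here's a minimal sketch (in Python - my choice of language here, purely for illustration) of the same computation phrased in two of those vocabularies:<br />
<br />
<pre>
# The same task - summing the squares of the even numbers in a list -
# expressed two ways. Same result, different conceptual vocabulary.
nums = [1, 2, 3, 4, 5, 6]

# Imperative: spell out the steps, mutate an accumulator.
total = 0
for n in nums:
    if n % 2 == 0:
        total += n * n

# Functional: describe the relationship, compose expressions.
total_fn = sum(n * n for n in nums if n % 2 == 0)

assert total == total_fn == 56
</pre>
<br />
If you have the words "imperative" and "functional", you see two different ways of thinking; if you don't, both are just "a program".<br />
<br />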
And having the word and the concept influences other aspects of thought. Philosophies in China (and similar Asian cultures) tend towards timelessness, reincarnation, and "balance" between action and non-action - all of which makes sense when we consider that their languages don't easily differentiate between past, present, and future. If I can't describe the differences between tenses, won't my phrasing - and eventually my thought pattern - reflect that kind of timelessness?<br />
<br />
I've heard one definition of "genius" that describes it as the ability to conceive of something one doesn't yet have the language to describe. Many of us have invented words at times, trying to differentiate something that heretofore hadn't been differentiated. Sometimes it's a matter of nuance, and sometimes it's simply a concept that doesn't exist in our native language. A recent addition to the English language (beyond mere technical or "slang" terms) is the concept of a "meme", coined by Richard Dawkins. <br />
<br />
This is probably the best argument for studying multiple languages - and not just ones with a common root. If you speak English, learning German - or even a Romance language - doesn't give you nearly as much as learning Greek, Chinese, or Russian. At the same time, English has become the international language not just because of the power of English-speaking countries like the US but because of its amazing breadth: English has adopted/stolen/incorporated words and concepts from other languages perhaps more freely than any other language in existence. It also has an unusually rich set of tense and aspect forms, allowing for very precise statements about time - a major benefit in technical and scientific fields.<br />
<br />
And yet any language still has its limits. My favorite example of a concept which does not exist in English is the Greek <i>arete</i>: it's been loosely translated as "virtue" or "excellence", but "essence of being" would also be close. None are exact, though; it's a concept which exists only by abstract association in English, and yet it was so intrinsic to Greek culture that it helped give rise to the Olympics.<br />
<br />
The important factor to remember is this: even the language you speak can frame the concepts of which you can think. It's important to try to work beyond linguistics to the more abstract thoughts that <i>cannot</i> be easily converted into language. Who knows - you might even get to make up a new word.Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-2068219912533170221.post-53075624846651099182010-10-13T14:40:00.000-07:002010-10-13T14:40:25.268-07:00Just the factsWe live in a world that follows rules. We don't know what most of those rules are; some of them we understand a bit and can approximate to a decent degree. Not knowing the rules, however, doesn't change the fact that the rules exist and are remarkably consistent.<br />
<br />
The more you know about the rules and about the world around you, the better off you'll be in trying to maneuver through life. To this end, the most important thing in life is the truth. If you don't have facts and real information, any decision you make is automatically flawed. Determining what is real - what is true - is the most important act you will ever undertake.<br />
<br />
What's more, you'll have to make that determination every moment of every day of your life. As I said, we only know approximations of some of the rules; the rest are either completely unknown or guesses at best. Sometimes we don't even know that a rule exists. So, every time you hear a statement or learn a "fact", you need to be able to make a determination about its truth.<br />
<br />
The most obvious way to verify a statement is to see if it's consistent with what you can observe in the world around you. If someone tells you the sky is pink but you look up and see it's blue, you've just falsified that information. If someone tells you that "doing x will cause y", and you do "x" repeatedly but "y" doesn't happen, you've falsified that statement; if "y" *does* happen consistently, you have evidence - though not proof - that the statement is true.<br />
<br />
While it would be great to say that something is simply either true or false, we don't have a way to do so most of the time. The best we can often do is approximate - determining whether something is likely true or likely false rather than absolutely true or false. Some statements are easily proven true but hard to falsify; for others, the reverse holds: easy to falsify but hard to prove. There are also statements which can't be proven either way - sometimes because we don't have the ability right now, sometimes because we never will.<br />
<br />
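One way to make "likely true" concrete - a toy sketch in Python with invented numbers, not a claim about how anyone actually reasons - is simple Bayesian updating:<br />
<br />
<pre>
# Treat the claim "doing x causes y" as a probability, not a binary,
# and update it after each trial. The 0.9/0.1 likelihoods are
# illustrative assumptions, not measured data.
def update(prior, saw_y, p_y_if_true=0.9, p_y_if_false=0.1):
    num = (p_y_if_true if saw_y else 1 - p_y_if_true) * prior
    den = num + (p_y_if_false if saw_y else 1 - p_y_if_false) * (1 - prior)
    return num / den

belief = 0.5                             # start undecided
for saw_y in [True, True, False, True]:  # four trials of doing x
    belief = update(belief, saw_y)
print(round(belief, 3))                  # 0.988: likely true, not proven
</pre>
<br />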
The most important thing in any life is truth, and the most important skill you can hone is learning how to identify the level of truth in a statement. Observable reality is the only measuring stick of value. If something doesn't match up with observable reality, it simply isn't true.Unknownnoreply@blogger.com2tag:blogger.com,1999:blog-2068219912533170221.post-16218186384974803642010-10-01T14:22:00.000-07:002010-10-01T14:22:11.243-07:00The map is not the territory.Most people have no clue what that phrase means, but it's very important.<br />
<br />
In a park near you, there is likely a tree of some sort. It doesn't matter if it's tall or short, evergreen or deciduous, flowering or not. The tree exists.<br />
<br />
You have an image of the tree, a concept of it, a description. Because of this, you can reference it in conversations with others, or in poetry if the fancy takes you. But - and here is the tricky part - <i>the concept you have of the tree is not the same thing as the tree itself.</i><br />
<br />
The tree exists as a thing, whether you think of it or not, whether you name it or not, whether you comprehend it or not. The thing itself, the existence that would be true even if there were nothing to observe it, is called the noumenon. Your idea of the thing, your descriptions and memories and such, is called the phenomenon ("the thing as perceived"). The two are very distinct.<br />
<br />
A single noumenon can have multiple phenomena - in fact, there's at least one for every observer. Each person, animal, insect, or even bacterium experiences the tree separately and thus has a different, unique way of describing it. These phenomena can be related, shared between people (or other creatures) with words or sounds or scents.<br />
<br />
The thing itself, though - the noumenon - is singular. It exists independent of (and in fact outside) observation, and it cannot be directly related or experienced. In a sense, the phenomenon is the map of your neighborhood, while the noumenon is the actual neighborhood itself. The map is useful, but it can't in any way represent the full reality of the neighborhood: the smells, the sights, the friendliness or animosity of neighbors, etc.<br />
<br />
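For the programmers among us, the closest analogy I can offer - and it is only an analogy, sketched in Python - is a single value with many observer-specific renderings:<br />
<br />
<pre>
import datetime

# One "thing": a single instant in time.
instant = datetime.datetime(2010, 10, 1, 14, 22, 11)

# Several observer-dependent descriptions of it. None of these
# strings IS the instant; each is one phenomenon of it.
print(instant.isoformat())            # 2010-10-01T14:22:11
print(instant.strftime("%B %d, %Y"))  # October 01, 2010
print(instant.timestamp())            # seconds-since-epoch view
</pre>
<br />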
The point is, the two are separate. The description or experience of something, while useful, is <i>not</i> the thing itself. The map is <i>not</i> the territory.<br />
<br />
The problem arises when people begin to think that the map <i>is</i> the territory: when they start to treat their description of something as equivalent to the thing itself. One very common and unfortunate example of this is stereotyping: someone recognizes a prominent characteristic of a group of people and then uses that characteristic (and usually whatever negative traits they associate with it) as the full description of anyone in the group. Individuals are, obviously, individuals, and usually differ drastically in many ways, but to the stereotyper, the description (the map) has become synonymous with the individual (the territory).<br />
<br />
Other examples:<br />
<ul><li>In organizations, the tendency over time is to begin to treat the company like an org chart or process description, with people merely filling in roles. This generally causes tension and discord, since organizations are living, fluid, dynamic systems.</li>
<li>In science, many scientists who have worked with certain theories for a long time begin to treat the theory as reality, rather than as a description of reality; this often leads to drama when data arises that doesn't fit the theory.</li>
<li>In schools, grades have become the reason for the classes, rather than a method for demonstrating understanding of the content of the classes. A high test score is not equivalent to comprehension of the content.</li>
</ul><br />
Sometimes, people try to confuse map and territory deliberately. A good example is the "Tea Party", which encourages people to see one aspect of the movement (the anti-government portion) as the whole, and to miss the full spectrum of what the movement is actually doing or vowing to do (to be fair, most political parties do this).<br />
<br />
<div>Maps should be dynamic: they change as new information becomes available, just like Google updating its satellite photos. Too often, mistaking the map for the territory leads to undue dependency on, or emotional ties to, a specific map. That makes it harder (if not impossible) for the individual to adapt as reality changes, and makes any decision based on the faulty map automatically suspect.</div><br />
If you ever find yourself frustrated that something isn't behaving "as it should", that is almost always a sign that your map is faulty and needs to be adjusted: either your understanding is flawed or your information is incomplete (or both). The noumenon is perfect (in the sense that it is entirely self-consistent), so any inconsistency has to be in the phenomenon - your understanding and experience of it.<br />
<br />
Ignoring the realities of the territory and simply assuming the map is correct can lead you to walk off of cliffs, metaphorical or otherwise.Unknownnoreply@blogger.com1tag:blogger.com,1999:blog-2068219912533170221.post-26403517298050374612010-09-20T11:36:00.000-07:002010-09-20T11:36:46.031-07:00|2b|"It’s like doing a jigsaw puzzle where you can change the shape of the pieces as well as their positions."<br />
<br />
This is one man's description of what programming is like from the perspective of the programmer. From a comprehensibility standpoint, I think it's pretty good.<br />
<br />
Programming, as a function, is something that mystifies the vast majority of the population; as a programmer, I think it's safe to say that even the majority of the people paid to program have no real idea what they're doing on the conceptual level. If you ask a bunch of programmers "what do programmers do?", you'll likely hear a range of replies from the uber-technical ("transliterate a requirement") to the basic ("implement a process") - and that's just on the actual work performed. If you ask a bunch of programmers *how* they do what they do, you likely won't get any really substantive answer at all.<br />
<br />
It "should" be simple, really. All programming languages consist of a basic series of operations and relationships; such series are notably finite and often fairly short, when you actually look at the principles - offhand, I'd say there are less than 200 specific relators covering all programming languages (though syntax often varies wildly), and probably less than 20 if we are willing to use abstraction. All we're really doing when we program is combining various objects/concepts/"things" using these relators.<br />
<br />
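To give a feel for how few relators you really need, here's a minimal sketch in Python ("four" is my illustrative count, not a formal claim): a complete little program built from nothing but sequence, selection, iteration, and abstraction.<br />
<br />
<pre>
def count_vowels(text):       # abstraction: name a process for reuse
    total = 0
    for ch in text:           # iteration: repeat a step over the input
        if ch in "aeiou":     # selection: choose between alternatives
            total += 1        # sequence: each step builds on the last
    return total

print(count_vowels("relator"))  # 3
</pre>
<br />
Nearly everything else in programming is these same relators, layered and renamed.<br />
<br />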
Of course, "all" a painter does is put pigment on canvas, and pianists only have 72 or so notes with which to play. The complexities of painting and composition come not from the tools or options but in how those tools and options are utilized. There's a creative step, an inuitive moment or leap that takes place. This leap generally takes things we understand or comprehend and presents them in ways we do not expect and is usually founded in describing the relationship between concepts that heretofor have not been understood as related. The photographer captures profound emotion in a simple flower; the painter portrays the subject from a prespective never before seen; the musician writes a motif that recalls a spring rain.<br />
<br />
The fundamental principle here is an understanding - albeit usually subconscious - of relationships. To quote Lewis Carroll: why is a raven like a writing desk? The first act of creation is establishing the existence of the unknown - the fact that something is waiting to be created. Usually, this understanding then provides the context for creation - in essence, understanding what is missing gives us the first step towards understanding what we want to create. We then have to frame that desire in the context of our medium - the semantics of creation are different for photography, music, painting, and poetry.<br />
<br />
We can formalize these steps as lack (or problem), desire, and semantics. What we usually think of as the "creative step" is "desire": conceptualizing what it is we're trying to achieve in some kind of workable or formal structure. This is the "hard part", as it requires both clarifying the problem and recognizing the limits of the semantics, even while working separately from either.<br />
<br />
And this is the step in programming that most people don't understand. Semantics, in programming, is merely the syntax and grammar of the language - something even text editors can manage, so hardly a major intellectual hurdle. The "problem" is usually input from a client or customer, though such individuals often need help clarifying exactly what is wrong; still, this is a piece easily managed through standard processes. It is the clarification and implementation of desire that requires abstraction and the mental jigsaw puzzle: figuring out how to fulfill the needs presented in the desire under the limitations of the semantics, when there are quite literally infinite potential solutions but only a few practical ones.<br />
<br />
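To show what "infinite potential solutions but only a few practical ones" looks like in practice, here's a small Python sketch: two programs that satisfy the exact same desire (the nth Fibonacci number), only one of which is practical at scale.<br />
<br />
<pre>
def fib_naive(n):
    # Correct, but re-solves the same subproblems exponentially often.
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

def fib_iterative(n):
    # Same contract, linear time: a "practical" shape for the piece.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

assert fib_naive(10) == fib_iterative(10) == 55
</pre>
<br />
The semantics permit both; the jigsaw puzzle is in finding the shapes worth keeping.<br />
<br />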
Programming is, for all its technical requirements, a creative act not terribly different from any other artistic pastime. Just like the other arts, the moment of inspiration in programming cannot be laid out in a procedure or taught in a classroom. One can teach someone to use a camera, but one cannot teach someone to take inspiring photographs; one can teach someone English, but one cannot teach them to write beautiful poetry. One can teach programming languages, syntax, and grammar, but one cannot teach effective programming.<br />
<br />
Obviously, the better a person is as a coder (or photographer, or linguist, or...), the wider the range of tools and options available to that person while programming (or photographing, or writing, or...). However, as art forms, all of these rely on moments of inspiration. One can encourage the mindset needed to have such moments, and train oneself (or others) in the sort of abstract daydreaming that forms their basis, but one cannot explicitly teach inspiration itself. It comes naturally or not at all.Unknownnoreply@blogger.com2tag:blogger.com,1999:blog-2068219912533170221.post-13112386772124952712010-06-27T14:52:00.000-07:002010-06-27T14:52:41.920-07:00Sometimes other people say it best<blockquote>I am the outcome of a trillion coalescing possibilities...</blockquote>That's a quote from PZ Myers; go read <a href="http://scienceblogs.com/pharyngula/2010/06/sunday_sacrilege_so_alone.php">the whole thing</a>.Unknownnoreply@blogger.com10tag:blogger.com,1999:blog-2068219912533170221.post-19071838394647419712010-06-03T08:08:00.000-07:002010-06-03T08:08:51.072-07:00Apples and protonsSo here's the deal. This apple, the one we can't decide whether it's real or not, isn't just an apple. It's a proton - or electron, or neutron, or really any particle of matter in the universe.<br />
<br />
Most people imagine particles as little tiny marbles - mainly because this is how they're presented in chemistry and physics classes. That's the old model, though, and even the "new" model - developed with the rise of quantum physics - is becoming more and more suspect.<br />
<br />
You see, just like our first apple, particles don't really exist. They're not physically "there" with any exact properties we can determine. Everything we know about them we know indirectly by how they interact with other matter (of course), but since we don't know that the other matter exists either, we're in a bit of a conundrum.<br />
<br />
We <em>do</em> know that the idea of little marbles doesn't work, for a variety of reasons. Again, that's not new. One basic flaw is with the electron, which doesn't exist like a moon orbiting a planet but more like a shell of potential around the nucleus (and no, there's not really a simpler way of saying it). What we didn't know then - but increasingly suspect now - is that the nucleus is the same: a mass of potential rather than a bunch of marbles stuck together. When we "knock a neutron out", it's less a billiard-ball collision and more like the properties of a neutron being separated out of the miasma.<br />
<br />
It gets "worse", though. Even a separate, distinct free-floating particle like a neutron isn't "whole" - it's made up of smaller particles, or <em>seems to be</em>. However, these <em>particles</em> - such as quarks - don't seem to be able to exist by themselves for any practical length of time. So, whether they really exist independently or are merely the "shards" of an "exploding" particle, we don't know.<br />
<br />
And then we get into the matter/energy conundrum: how can two things with no common physicality be interchangeable? The answer, of course, is that they have to have some common physicality - some mechanism or particle or <em>something</em> - that is matter when grouped like <em>this</em> but energy when grouped like <em>that</em>. Such is part of the search for the mystical Theory of Everything, which is looking to be more and more mythical as time goes on.<br />
<br />
So, we don't know what anything is, we can't fundamentally describe anything, and we're not even sure we'll ever be able to. Yet, matter exists (or seems to). How?<br />
<br />
This is our second apple - the average, the concept, or, in physics terms, the superposition of the combined wave functions. Detectable matter, to the best of our perception, only exists as an average - even on subatomic scales. We can categorize things loosely and describe fuzzy margins with inherent leeway, but we can never state exactly what something is or isn't. When we look for something, it <em>seems</em> to be there, but if we study its effects and try to pinpoint it, we can't be sure.<br />
<br />
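In the standard formalism, that "average" is literal - this is textbook notation, not mine. A state is a weighted sum of possibilities, and any measurable property is the probability-weighted average over them (in LaTeX notation, with the possibilities taken as definite positions):<br />
<br />
<pre>
|\psi\rangle = \sum_i c_i\,|x_i\rangle ,
\qquad
\langle \hat{x} \rangle = \sum_i |c_i|^2\, x_i
</pre>
<br />
The |c_i|^2 are the probabilities of finding the particle at each position x_i; the "position" we perceive is the weighted average, not a marble sitting at a point.<br />
<br />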
Luckily, "fuzzy margins" are good enough to a lot of things - in fact for pretty much any applied science, such as engineering or nuclear physics. However, the fuzziness places an accuracy limit on what we can know at this point. Unless we can find some more fundamental principle that can eliminate the fuzziness, we're rapidly coming to a wall in our understanding of basic existence.<br />
<br />
So, the next time you sit in a chair, try to understand that the only reason you don't fall to the floor is that the average of your existence can interact with the average of its existence. And the next time you eat an apple, try to picture it as a superposition of potentials, not just a piece of ripe fruit.Unknownnoreply@blogger.com5tag:blogger.com,1999:blog-2068219912533170221.post-55819581668122450652010-06-02T11:15:00.000-07:002010-06-02T11:15:33.460-07:00Every apple is a real apple.Take every apple that ever existed, whether real or imaginary. Combine them all into a single structure - not exactly by averaging, but by overlaying the different pieces of information together to get a cohesive whole. The result would be something that, basically, averages out to an apple in theory but doesn't necessarily resemble a physical apple.<br />
<br />
That's okay, though, because we're not after a physical apple. We're after a potential apple - in essence, a kind of mathematical function that defines the limits within which "apple" exists. The limits are fuzzy, just like the edges of our structure: some characteristics don't start and stop so much as fade, and fractal math suggests that such edges must be infinite in gradation. In a linguistic sense, you're left with a bunch of traits that are "apple", a bunch that are "apple-ish", and some that "tend to be apple" at one end but gradually become less apple-y at the other.<br />
<br />
Now, let's use that structure - that equation or description, however you want to think of it - as our definition of "apple". Suddenly, every apple is a real apple, because we're not looking for specifics so much as trends or patterns.<br />
<br />
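For the programmers: this is essentially a fuzzy membership function. Here's a toy sketch in Python - the traits, weights, and thresholds are invented for illustration; the point is the graded score, not the numbers.<br />
<br />
<pre>
def triangular(x, low, peak, high):
    """Membership rises from 0 at low to 1 at peak, fades to 0 at high."""
    if x <= low or x >= high:
        return 0.0
    if x < peak:
        return (x - low) / (peak - low)
    return (high - x) / (high - peak)

def appleness(diameter_cm, color):
    size = triangular(diameter_cm, 4, 8, 12)   # "fist-sized-ish"
    hue = {"red": 1.0, "green": 0.9, "yellow": 0.8}.get(color, 0.2)
    return size * hue

print(appleness(8, "red"))      # 1.0 - squarely "apple"
print(appleness(6, "yellow"))   # 0.4 - "apple-ish"
print(appleness(30, "red"))     # 0.0 - a beach ball, perhaps
</pre>
<br />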
This is hard for a lot of people to grasp: we've traded something hard and explicit for something vague, but in doing so we end up with a better definition and, in actual fact, a better understanding of that to which we're referring. Whereas before nothing could be said to be an apple (making the definition useless), we can now define with reasonable precision what is or isn't an apple and use that understanding for further explorations (such as defining what is or isn't an apple pie).Unknownnoreply@blogger.com3