I've seen enough movies to notice a pattern.
Person A: "I have something to tell you!"
Person B: "I have something to tell you, too!"
Person A: "Well you go first!"
Person B says something disappointing to Person A that causes Person A not to say what he was going to say.
Or the scenario might look more like this:
Person A: "I have something to tell you!"
Person B: "Okay, but first let me tell you something!"
Then the same thing happens. The whole time we're sitting there thinking, "If only Person A had been allowed to go first! This whole thing could've been resolved! Augh! You idiots!"
So I've decided that if you've got something to say to somebody, you should just say it. Or if they tell you they have something to say to you, then you should just listen. Otherwise, you could postpone the resolution to your drama for another hour or more.
Wednesday, March 29, 2006
Friday, March 17, 2006
How to make a bamboo-backed ipe longbow
I haven't been motivated by anything but laziness lately, which is why I haven't added anything, but I just finished doing another build along and thought y'all might not like to have a look-see, but I'd post it anyway.
Here is the build along.
Friday, March 03, 2006
The biases and motives of moral realists and non-realists
I was thinking of an example of how our biases affect how we attribute motives to others. Here are two opposing sides.
There are some people who say there are objective moral values. There are other people who say there are not objective moral values. It is interesting to notice how each side attributes motives to the other side.
[Since I brought it up, lemme make a detour here. It’s an informal logical fallacy to try to refute a position by pointing out sinister motives in the people who hold that position. That’s a form of the ad hominem fallacy, and it’s a fallacy because it suffers from irrelevance. Our motives for holding a belief have nothing to do with whether or not those beliefs are true.]
Moral non-realists will often say the reason people hold to objective standards of good and evil is because they have a sinister desire to control everybody. They figure since morals are relative, people shouldn’t impose their own personal values on other people. (Never mind the inconsistency in this position; it’s not my point.) They figure if people are trying to impose their values on everybody else, then they just have an unhealthy need to suppress, control, and manipulate other people.
Moral realists will often say the reason people deny morality is because they want to justify their own actions. Rather than submitting to objective standards of good and evil they know are true, they pretend they aren’t real. It allows them to indulge in their guilty pleasures without the guilt. A guilty pleasure without the guilt is just pleasure. (Never mind the inconsistency in this position, too; it's not my point either.) Moral realists figure since everybody knows deep down inside that there’s a difference between right and wrong, there must be some sinister motive for being in denial.
As I write the above, I must admit that my own bias is in full swing. I have heard both of the above accusations, and I find myself agreeing with the moral realists and disagreeing with the moral non-realists. And, surprise, I’m a moral realist!
What advantage is there in worrying about somebody's motives anyway? Shouldn't we be more concerned with whether or not their position is true than with whatever motivates them to embrace it? I guess that depends. If we're trying to discover the truth of the issue, then their motives are irrelevant. But if we're trying to reason with them, and reason has nothing to do with why they hold their position, then perhaps there is some advantage to exploring what their motives might be. By getting people to be honest with themselves about their motives, maybe they will be more open to reason. It does no good, of course, to try to convince people that they have motives when they really don't. Attributing motives falsely to people is a good way to discredit yourself with them. It also puts them on the defensive, and people never listen once they're on the defensive.
Thursday, March 02, 2006
How our biases skew our conclusions about the motives of other people
Earlier tonight I read Dagoods' most recent blog entry where he said he thinks Christians avoid learning about their opposition because they are afraid. Now I don't deny that this is the case for a lot of Christians, but the lack of balance in his post got me to thinking about something. We all seem to more readily attribute sinister motives to those who are against us than to those who are for us. Seems only natural, doesn't it?
Dale Carnegie in How to Win Friends and Influence People made the same point. I don't have the book with me, but he argued in one of the first few chapters that people rarely ever blame themselves for anything. They always find some way to justify their actions. C.S. Lewis made the same observation in one of the first few chapters of Mere Christianity. He said we are so aware of the moral law that we can't bear to face the fact that we've broken it. Consequently, we always put our bad behavior down to circumstances out of our control. We let ourselves off the hook somehow. But we put our good behavior down to ourselves.
We are far more likely to blame others than to blame ourselves. We have no shortage of good excuses for our own actions. But when other people behave badly (especially when they behave badly toward us), we are not so generous. We don't extend the benefit of the doubt as readily to others as we extend the justification to ourselves.
One of the examples Dagoods brought up was that Christians often misrepresent their opposition. This was an example of what he called "lack of honest inquiry." So basically he's accusing Christians of intellectual dishonesty when they misrepresent their opposition and then make straw-man arguments. Isn't it interesting that his only explanation for these misrepresentations is "lack of honest inquiry"? He doesn't even raise the possibility that some Christians could honestly misunderstand their opposition. To hear Dagoods tell it, you'd think this was a problem unique to Christians. Christians are deceitful scum for dishonestly misrepresenting non-believers, but of course non-believers never do that.
Let's pretend that we are observing a debate between a theist and an atheist on the existence of God. Both debaters have published a number of books and articles, and we have read them all. We have studied them thoroughly, and we are equally informed on both of their views. Now let's say that while we're observing this debate we notice that both of them misrepresent the other's position. What is our gut reaction to this?
That seems to depend on whose side we're on, doesn't it? We assume our guy made an honest mistake and maybe just doesn't really understand the other guy. But we assume the other guy is intentionally misrepresenting our guy. He's being intellectually dishonest. This is one case of how our bias can influence our conclusions about the motives of others.
We can't do away with our biases, but by being aware of them, we can make a more conscious effort to be fair. We shouldn't attribute motives to people without proper justification. We should be just as harsh with ourselves when we've done wrong as we are with others when they've done wrong. And if we're going to justify our own bad behavior, then we ought to be open to possible justifications for other people's bad behavior. Easier said than done, but if truth matters, then it's worth the effort, because truth requires consistency.
Wednesday, March 01, 2006
How our presuppositions skew our interpretation of our experiences
Presuppositions are those background beliefs we all have that we don't really think about. Beliefs like, "My senses give me true information about the world," and "The future will resemble the past," are background beliefs that just about everybody has but that most people don't think about consciously. They just sort of automatically apply those beliefs to their experiences.
Sometimes our presuppositions are wrong. But whether they're right or wrong, they do have a big influence on how we filter information that comes our way. I was thinking about this last night and I came up with a real life example to explain what I mean.
Between the ages of 2 and 6, I lived in Abilene, Texas. Out in west Texas, there aren't a whole lot of trees, but there is a whole lot of sky. With all that sky, lightning storms are pretty amazing.
Let me back up a little. I remember watching on TV where this guy had a model of the earth and the sun. He was explaining night and day. On his little model, half the earth was covered in a black shell. He rotated the shell around the earth to explain night and day.
Now the shell obviously just represented the night sky, but you have to understand how something like that would look to a five-year-old kid. (I'm guessing I was about 5 at the time.) I thought the model literally represented the way things are. I thought there really was half a shell that rotates around the earth. The light was all around, but the shell blocked it out on half the earth.
This belief became a given to me. It acted like a presupposition. So one night I was sitting in the driveway looking at the lightning spreading out across the sky. It was pretty amazing. As I watched it, I tried to understand it, and I remember my thinking. I was looking at the black sky thinking, "That's the black shell around half the earth." Whenever I'd see the lightning, I'd also hear the thunder, and I started to draw some conclusions. I figured what was going on was that this shell was under some pressure because of the storm. Every now and then it would crack because of the pressure. The cracks would let in the light from the other side, and that's what lightning was. It would also make a loud sound when the shell cracked, and that was the thunder.
Another night, I was looking at the stars and thinking about the shell. I figured that shell must be really old since it's been rotating around the earth forever. Since it's so old, it's a bit tattered and has some holes in it. The light comes in through the holes, and that's the stars.
See how that one presupposition influenced my interpretation of my observations? That's the way presuppositions are. So we ought to try to be more conscious of them. Bad presuppositions will result in us drawing the wrong conclusions about reality and prevent us from drawing the right conclusions.