Today I read a blog post by @joestump in which he attempted to refute claims that Digg had been "doing it wrong" when they could not scale their database and went with a NoSQL approach instead. I'm not going to get into that flame war. I will say that the folks at Digg should have expected exactly this kind of scrutiny when they decided to start bragging about their technology decisions. If a company has a technology blog, bragging is all they are trying to do. Anyway, back to the blog post... I was immediately disappointed to see that it was riddled with logical fallacies. I see logical fallacies all the time, and sometimes I forget that many folks are not very familiar with them. In particular, I would say that programmers are not very familiar with the formal concept, despite the fact that programmers tend to have very strong logical reasoning skills. Here is my attempt to do something about that.
First off, I am not going to answer what is a logical fallacy. You can follow that link, or find many other descriptions and enumerations of fallacies. Instead, I want to talk about why programmers may not be aware of fallacies, and why it is important for them to learn about them. I think the reason programmers may not know much about fallacies is that they are generally taught in classes on writing, public speaking, or debating. In other words, even though these are concepts tied to logical reasoning, including induction and deduction, they are not covered in classes on mathematical logic. You cannot, in general, express a fallacy using symbols and computational expressions. This might lead some to believe that fallacies are subjective, but that is not really true.
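To make the distinction concrete, here is a minimal sketch of my own (not anything from the original post). The formal fallacies are the exception: their invalidity is purely a matter of shape, so you can check them mechanically with a truth table. The informal fallacies, which are the ones you actually run into in arguments, have no such shape to check.

    from itertools import product

    def implies(p, q):
        """Material implication: P -> Q."""
        return (not p) or q

    def valid(premises, conclusion):
        """An argument form is valid if the conclusion holds under every
        assignment of truth values that makes all the premises true."""
        return all(conclusion(p, q)
                   for p, q in product([True, False], repeat=2)
                   if all(prem(p, q) for prem in premises))

    # Modus ponens: P -> Q, P, therefore Q. Valid.
    print(valid([lambda p, q: implies(p, q), lambda p, q: p],
                lambda p, q: q))   # True

    # Affirming the consequent: P -> Q, Q, therefore P. A formal fallacy;
    # it fails when P is false and Q is true.
    print(valid([lambda p, q: implies(p, q), lambda p, q: q],
                lambda p, q: p))   # False

There is no analogous table for an informal fallacy like an ad hominem or a straw man; the error lives in the content and context, not the form, which is exactly why it never comes up in a mathematical logic class.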
So why are fallacies important if they are not easily expressed mathematically, and thus difficult to represent programmatically? I suppose that if all you do is program on your own, then perhaps they are not relevant to you. If instead you work on a team with other programmers (or non-programmers, for that matter), then you will have ideas that get argued and debated. In that case, you need to understand fallacies -- both so that you can express your arguments without committing one, and so that you can recognize when the people arguing against you are committing them.
If you find yourself committing a fallacy in one of your own arguments, then of course you will want to "fix" the argument to remove the error. This will inevitably draw out the assumptions in your argument and strip out whatever is not essential and true. Sometimes this will leave you with nothing, and you will be forced to question the argument itself. Perhaps it was false, or, more likely, based mostly on subjective statements rather than fact. That can be a bummer, but wouldn't you rather realize that you are wrong than have somebody else point it out? Or worse, have nobody point it out...
You also need to recognize when others are committing fallacies. At the very least, you need to strike the fallacy from your own mental record of their argument. Does their argument make sense without it, or was it essential? Of course, once you start noticing these kinds of fallacies, you may be tempted to point them out and use that to assert that their argument is false and thus that your argument must be right. Oops, you just poisoned the well and committed a false dilemma -- a flawed argument can still have a true conclusion, and even if their conclusion is wrong, yours is not the only alternative.
Once you start noticing fallacies, you might notice that people commit them a lot. Sometimes this seems to be true irrespective of the perceived intelligence of the committer. It is easy to make these mistakes when you are unaware of them, and they are much more common when people are "thinking on their feet." In fact, I had a professor who once joked that those who were good at thinking on their feet were simply good at synthesizing fallacies. I'm not sure that is a fair generalization, but you get the point. I think people are also much more likely to make these kinds of errors when emotion has entered into the argument, and that was most likely the case in the blog post that inspired this one.