Thoughts on Boolean Laws of Thought

I first encountered symbolic logic when I was a teenager. Unfortunately, I had great trouble following the ostensible explanations that I encountered, and I didn't recognize that my perplexity was not because the underlying subject were intrinsically difficult for me, but because the explanations that I'd found simply weren't very well written. Symbolic logic remained mysterious, and hence became intimidating. And it wasn't clear what would be its peculiar virtue over logic expressed in natural language, with which I was quite able, so I didn't focus on it. I was perhaps 16 years old before I picked-up any real understanding of any of it, and it wasn't until years after that before I became comfortable not simply with Boolean expression but with processing it as an algebra.

But, by the time that I was pursuing a master's degree, it was often how I generated my work in economics or in mathematics, and at the core of how I presented the vast majority of that work, unless I were directed otherwise. My notion of an ideal paper was and remains one with relatively little natural language.

Partly I have that notion because I like the idea that people who know mathematics shouldn't have to learn or apply much more than minimal English to read a technical paper. I have plenty of praise for English, but there are an awful lot of clever people who don't much know it.

Partly I have that notion because it is easier to demonstrate logical rigor by using symbolic logic. I want to emphasize that word demonstrate because it is possible to be just as logically rigorous while expressing oneself in natural language. Natural language is just a notation; thinking that it is intrinsically less rigorous than one of the symbolic notations is like thinking that Łukasiewicz Polish notation is less rigorous than infix notation or vice versa. I'll admit that some people may be less inclined to various sorts of errors using one notation as opposed to another, but which notation will vary amongst these people. However, other people don't necessarily see that rigor when natural language is used, and those who are inclined to be obstinate are more likely to exploit the lack of simplicity in natural language.

But, while it may be more practicable to lay doubts to rest when an argument is presented in symbolic form, that doesn't mean that it will be easy for readers to follow whatever argument is being presented. Conventional academic economists use a considerable amount of fairly high-level mathematics, but they tend to use natural language for the purely logical work.[1] And it seems that most of them are distinctly uncomfortable with extensive use of symbolic logic. It's fairly rare to find it heavily used in a paper. I've had baffled professors ask me to explain elementary logical transformations. And, at least once, a fellow graduate student didn't come to me for help, for fear that I'd immediately start writing symbolic logic on the chalk-board. (And perhaps I would have done so, if not asked otherwise.)

The stuff truly isn't that hard, at least when it comes to the sort of application that I make of it. There is a tool-kit of a relatively few simple rules, some of them beautiful, which are used for the lion's share of the work. And, mostly, I want to use this entry to high-light some of those tools, and some heuristics for their use.

First, though, I want to mention a rule that I don't use:

A = A

This proposition, normally expressed in natural language as A is A and called the Law of Identity, is declared by various philosophers to be one of the three Principles of logic. But I have no g_dd_mn'd idea what to do with it. It's not that I would ever want to violate it; it's just that I literally don't see anything useful to it. Ayn Rand and many of those for whom she is preceptrix treat it as an essential insight, but I think that it's just a dummy proposition, telling me that any thing can stand where that thing can stand.[2]

The remaining two Principles of logic are rather more appealing, but I seldom make use of them that is both direct and explicit.

There's the Law of Non-Contradiction,

~(P & ~P) for all P

Now, I use proof by contradiction fairly often, but I don't explicitly haul-out the Law of Non-Contradiction when I do so. (Also, were all else equal, I'd avoid proof by contradiction, because I've often seen people flummoxed by it. But all else is rarely equal, and I don't generally delay work to find some more pleasing proof.)

Then there's the Law of the Excluded Middle,

(P v ~P) for all P

which I find myself using even less directly than that of Non-Contradiction.
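For concreteness, both Principles can be verified mechanically over the two truth values; a minimal truth-table check, sketched here in Python:

```python
# Truth-table check of the two Principles, over both truth values.
for P in (False, True):
    assert not (P and not P)  # Law of Non-Contradiction: ~(P & ~P)
    assert P or not P         # Law of the Excluded Middle: (P v ~P)
```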

The tools that I use heavily and explicitly in symbolic logic are those that help me to find underlying parallels or symmetries that will allow me to reduce complex expressions to simpler expressions. Finding those parallels or symmetries is a matter of seeing how to turn conjunctions into disjunctions or vice versa, disjunctions into implications or vice versa, negations into affirmations or vice versa, so that expressions may be consolidated or analyzed and reconsolidated.

Of course a double negative is a positive, but people need to learn also to turn that around, when they need a negative, by converting positives into double negatives:

P = ~~P
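A truth-table sketch of the double-negative rule:

```python
# Double negation: P = ~~P
for P in (False, True):
    assert P == (not (not P))
```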

Redundant conjunctions or disjunctions can be simplified, but often simple propositions should be made redundant,

P = (P & P & ... & P)
P = (P v P v ... v P)

to exhibit parallels in the structure containing them.
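The same sort of check, with three-fold redundancy standing in for the general case:

```python
# Idempotence: P = (P & P & ... & P) and P = (P v P v ... v P)
for P in (False, True):
    assert P == (P and P and P)
    assert P == (P or P or P)
```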

Implications can be converted into disjunctions with the protasis negated; disjunctions can be turned into implications with a negated term turned into a protasis:

(P1 v P2) = (~P1 implies P2)
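A sketch of that conversion, with implication encoded in the usual way as a disjunction (the helper implies is my own naming, not anything standard):

```python
from itertools import product

def implies(a, b):
    # "a implies b" is encoded as (~a v b)
    return (not a) or b

# (P1 v P2) = (~P1 implies P2), checked over all four truth-value pairs
for P1, P2 in product((False, True), repeat=2):
    assert (P1 or P2) == implies(not P1, P2)
```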

Conjunction and disjunction each distribute over the other:

[P1 & (P2 v P3)] iff [(P1 & P2) v (P1 & P3)]
[P1 v (P2 & P3)] iff [(P1 v P2) & (P1 v P3)]
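Both distributions, checked over all eight truth-value triples:

```python
from itertools import product

# Conjunction and disjunction each distribute over the other.
for P1, P2, P3 in product((False, True), repeat=3):
    assert (P1 and (P2 or P3)) == ((P1 and P2) or (P1 and P3))
    assert (P1 or (P2 and P3)) == ((P1 or P2) and (P1 or P3))
```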

De Morgan's Laws of Logic allow one to focus or spread negations, finding conjunctions where there were disjunctions and vice versa:

~(P1 v P2) iff (~P1 & ~P2)
~(P1 & P2) iff (~P1 v ~P2)
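And both of De Morgan's Laws:

```python
from itertools import product

# De Morgan's Laws: negation trades & for v and vice versa.
for P1, P2 in product((False, True), repeat=2):
    assert (not (P1 or P2)) == ((not P1) and (not P2))
    assert (not (P1 and P2)) == ((not P1) or (not P2))
```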

Universal quantifiers can be conceptualized as conjunctions or (given certain philosophical commitments) as a generalization thereöf:

[(Pi) for all Pi in {P1, P2, ..., Pm}] = (P1 & P2 & ... & Pm)

Existential quantifiers can be conceptualized as disjunctions or (given certain philosophical commitments) as a generalization thereöf:

[(Pi) for some Pi in {P1, P2, ..., Pm}] = (P1 v P2 v ... v Pm)
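In programming terms, the finite cases of these two quantifiers are exactly the folds computed by Python's all() and any(); the particular truth values below are merely illustrative:

```python
from functools import reduce
from operator import and_, or_

Ps = (True, True, False)  # a finite family P1, P2, P3 (illustrative values)

# A universal quantifier over a finite family is an iterated conjunction,
assert all(Ps) == reduce(and_, Ps)
# and an existential quantifier is an iterated disjunction.
assert any(Ps) == reduce(or_, Ps)
```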

Above, I've written the inclusions for these quantifiers in the way that mathematicians and economists usually write them, but it's important to remember that an inclusion for a universal quantifier corresponds to an implication,

[(Pi) for all Pi in 𝐏] = {[P1 if (P1 in 𝐏)] for all P1}

and that for an existential quantifier corresponds to a conjunction,

[(Pi) for some Pi in 𝐏] = {[P1 & (P1 in 𝐏)] for some P1}

(In fact, I have come to prefer to write my inclusions in terms of implications and conjunctions.)
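A finite sketch of that correspondence, with a small universe standing in for everything quantified over (the universe U, the set S, and the predicate p are all illustrative assumptions of mine):

```python
U = range(10)          # a small universe, standing in for all candidates
S = {0, 2, 4, 6}       # the set playing the role of bold P

def p(x):
    return x % 2 == 0  # the property asserted of the members

# "p(x) for all x in S"  =  "[p(x) if (x in S)] for all x"
assert all(p(x) for x in S) == all((x not in S) or p(x) for x in U)
# "p(x) for some x in S" =  "[p(x) & (x in S)] for some x"
assert any(p(x) for x in S) == any((x in S) and p(x) for x in U)
```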

De Morgan's Laws apply to the quantifiers thus:

~[(Pi) for some Pi in 𝐏] = [(~Pi) for all Pi in 𝐏]
~[(Pi) for all Pi in 𝐏] = [(~Pi) for some Pi in 𝐏]
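The quantifier forms can be checked with the same finite stand-ins:

```python
Ps = (True, False, True)  # illustrative truth values for a finite family

# ~(some Pi) = (all ~Pi), and ~(all Pi) = (some ~Pi)
assert (not any(Ps)) == all(not P for P in Ps)
assert (not all(Ps)) == any(not P for P in Ps)
```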

About the only other rules that I consciously use are the commutative properties of conjunction and of disjunction:

(P1 & P2) = (P2 & P1)
(P1 v P2) = (P2 v P1)
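And a last truth-table check, for completeness:

```python
from itertools import product

# Commutativity of conjunction and of disjunction.
for P1, P2 in product((False, True), repeat=2):
    assert (P1 and P2) == (P2 and P1)
    assert (P1 or P2) == (P2 or P1)
```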


[1] There's an idiotic notion amongst a great many mainstream economists that the Austrian School tradition is somehow less rigorous simply because some of its most significant members eschew overt mathematics in favor of logical deduction expressed in natural language. But most of the mainstream is likewise not using symbolic logic; neither is necessarily being less rigorous than otherwise. The meaning of variables with names such as q_t can be every bit as muddled as that of those called something such as "the quantity exchanged at this time". There are good reasons to object to the rather wholesale rejection of overt mathematics by many Austrian School economists, but rigor is not amongst the good reasons.

[2] (2010:09/01) I note that some authors use Law of Identity to refer not to a principle that A can stand wherever A can stand for all A, but to a rule

(P if and only if P) for all P

However, this is no more than a compressed expression of

[(P only if P) & (P if P)] for all P

which is an alternate way of expressing

[(P only if P) & (P only if P)] for all P

which is a redundant expression of

(P only if P) for all P

which is just a different expression of

(P v ~P) for all P

which is just the Law of the Excluded Middle, rather than some further principle.
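That chain can itself be checked mechanically; a truth-table sketch (the helper only_if is my own naming):

```python
def only_if(a, b):
    # "a only if b" is the implication a -> b, encoded as (~a v b)
    return (not a) or b

for P in (False, True):
    iff = only_if(P, P) and only_if(P, P)  # "P if and only if P"
    assert iff == only_if(P, P)            # the redundancy collapses,
    assert only_if(P, P) == (P or not P)   # leaving the Excluded Middle
```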
