For each exercise set in the book, answers are given for problems 1, 3, 5, 10, 15, 20, 25, and so on. The teacher's manual (see Preface) has answers to the other problems.
Chapter 2 answers
2.1a
· 1. t is S
· 3. no L is B
· 5. all D is H
· 10. a is s
· 15. m is A
2.2a
· 1. This isn’t a syllogism, because “D” and “E” occur only once.
· 3. This isn’t a syllogism, because “Y” occurs three times and “G” occurs only once.
· 5. This isn’t a syllogism, because “Z is N” isn’t a wff.
2.2b
· 1. w is not s
· 3. no R is S
· 5. all P is B
2.2c
· 1. no P* is B* Invalid
· some C is not B*
· ∴ some C* is P*
· 3. no H* is B* Invalid
· no H* is D*
· ∴ some B* is not D
· 5. ∴ g* is g* Valid
· 10. all D* is A Invalid
· ∴ all A is D*
2.3a
· 1. all S* is D Valid
· all D* is U
· ∴ all S is U*
· 3. all T* is C Valid
· no C* is R*
· ∴ no T is R
· 5. all M* is R Valid
· some P is M
· ∴ some P* is R*
· 10. all S* is Y Invalid
· m is Y
· ∴ m* is S*
· 15. all N* is L Valid
· m is N
· ∴ m* is L*
· 20. b is W Invalid
· u is W
· ∴ u* is b*
· 25. some S is W Valid
· all S* is L
· all L* is H
· ∴ some W* is H*
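The star-test answers in 2.2c and 2.3a can be checked mechanically. Below is a minimal Python sketch (mine, not the book's), assuming the rules stated in the chapter: star each distributed letter in the premises and each undistributed letter in the conclusion; the argument is valid just when every capital letter is starred exactly once and exactly one right-hand letter is starred.

```python
# A sketch of the star test for syllogistic arguments (helper names are mine).
# Wffs are strings like "all S is D", "no H is B", "some C is not B",
# "m is A", "u is not b".

def parse(wff):
    """Return (left_letter, right_letter, form)."""
    w = wff.split()
    if w[0] == "all":                      # all A is B
        return w[1], w[3], "all"
    if w[0] == "no":                       # no A is B
        return w[1], w[3], "no"
    if w[0] == "some":                     # some A is B / some A is not B
        return (w[1], w[4], "some-not") if w[3] == "not" else (w[1], w[3], "some")
    if w[2] == "not":                      # a is not B
        return w[0], w[3], "is-not"
    return w[0], w[2], "is"                # a is B

# Which side is distributed in each form: (left, right).
DIST = {"all": (True, False), "no": (True, True), "some": (False, False),
        "some-not": (False, True), "is": (False, False), "is-not": (False, True)}

def star_test(premises, conclusion):
    stars = {}                             # capital letter -> number of stars
    right_stars = 0
    wffs = [(p, False) for p in premises] + [(conclusion, True)]
    for wff, is_conclusion in wffs:
        left, right, form = parse(wff)
        for letter, is_dist, is_right in [(left, DIST[form][0], False),
                                          (right, DIST[form][1], True)]:
            # Star distributed letters in premises, undistributed in conclusion.
            starred = is_dist != is_conclusion
            if letter.isupper():
                stars[letter] = stars.get(letter, 0) + starred
            if starred and is_right:
                right_stars += 1
    # Valid iff each capital is starred exactly once and there's one right star.
    return all(n == 1 for n in stars.values()) and right_stars == 1

print(star_test(["all S is D", "all D is U"], "all S is U"))  # True (2.3a #1)
print(star_test(["all S is Y", "m is Y"], "m is S"))          # False (2.3a #10)
```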
2.3b
· 1. We can’t prove either “Bob stole money” or “Bob didn’t steal money.” 2 & 6 yield no valid argument with either conclusion.
· 3. 4 & 8 & 9 prove David stole money: “d is W, all W is H, all H is S ∴ d is S.”
· 5. This would show that our data was inconsistent and so contains false information.
2.4a
· 1. all J is F
· 3. all S is R
· 5. some H is L
· 10. no S is H
· 15. all M is B
· 20. some H is not G
2.5a
· 1. “No human acts are free” or “No free acts are human acts.”
· 3. “Some free acts are determined” or “Some determined acts are free.”
· 5. No conclusion validly follows.
· 10. “No culturally taught racial feelings are rational” or “No rational thing is a culturally taught racial feeling.”
· 15. “Some who like raw steaks like champagne” or “Some who like champagne like raw steaks.”
· 20. “No basic moral norms are principles based on human nature” or “No principles based on human nature are basic moral norms.”
· 25. “No moral judgments are objective truths” or “No objective truths are moral judgments.”
2.6a
· 1. no B is C Valid
· all D is C
· ∴ no D is B
· 3. all E is F Valid
· some G is not F
· ∴ some G is not E
· 5. all A is B Valid
· all B is C
· ∴ all A is C
· 10. some V is W Invalid
· some W is Z
· ∴ some V is Z
2.7a
· 1. all R* is G Valid
· all G* is T
· all T* is V
· all V* is U
· ∴ all R is U*
· 3. g is A Valid
· all A* is R
· no R* is C*
· ∴ g* is not C
· 5. no S* is A* Valid
· all W* is A
· ∴ no S is W
· Premise 2 (implicit but false) is “All garments that should be worn next to the skin while skiing are garments that absorb moisture.”
· 10. all P* is O Valid
· all O* is E
· no M* is E*
· ∴ no M is P
· 15. e is C Invalid
· all S* is C
· ∴ e* is S*
· 20. all N* is C Valid
· no E* is C*
· g is E
· ∴ g* is not N
· Premise 3 (implicit) is “‘God exists’ is an existence claim.”
· 25. all D* is F Valid
· some P is not F*
· ∴ some P* is not D
Chapter 3 answers
3.1a
· 1. “Cop” is negative. “Police” is more neutral.
· 3. “Heroic” is positive. These are negative: “reckless,” “foolhardy,” “brash,” “rash,” “careless,” “imprudent,” and “daredevil.”
· 5. “Elderly gentleman” is positive. “Old man” is negative.
· 10. “Do-gooder” is negative. “Person concerned for others” and “caring human being” are positive.
· 15. “Booze” is negative or neutral. “Cocktail” is positive, while “alcohol,” “liquor,” and “intoxicant” are neutral.
· 20. “Babbling” is negative. “Talking,” “speaking,” and “discussing” are neutral.
· 25. “Bribe” is negative. “Payment” and “gift” are neutral or positive.
· 30. “Whore” is negative. “Prostitute” is more neutral.
3.2a
· 1. A false statement that you think is true isn’t a lie.
· 3. (1) One who believes in God may not make God his or her ultimate concern. (2) One may have an ultimate concern (such as making money) without believing in God. (3) “Object of ultimate concern” is relative in a way that “God” isn’t: “Is there an object of ultimate concern?” invites the question “For whom?” – while “Is there a God?” doesn’t.
· 5. Since “of positive value” is no more clearly understood than “good,” this definition does little to clarify what “good” means. And there’s the danger of circularity if we go on to define “of positive value” in terms of “good.”
· 10. (1) If I believe that Michigan will beat Ohio State next year, it still might not be true. (2) If “true” means “believed,” then both these statements are true (since both are believed by someone): “Michigan will beat Ohio State next year” and “Michigan won’t beat Ohio State next year.” (3) “Believed” is relative in a way that “true” isn’t: “Is this believed?” invites the question “By whom?” – while “Is this true?” doesn’t.
· 15. This set of definitions is circular.
3.2b
· 1. This is true according to cultural relativism. Sociological data can verify what is “socially approved,” and this is the same as what is “good.”
· 3. This is true. The norms set up by my society determine what is good in my society, so these norms couldn’t be mistaken.
· 5. This is undecided. If our society approves of respecting the values of other societies, then this respect is good. But if our society disapproves of respecting the values of other societies, then this respect is bad.
· 10. This is true according to CR.
· 15. This is false (and self-contradictory) according to cultural relativism.
· 20. This is undecided, since cultural relativism leaves unspecified which of these various groups is “the society in question.”
3.4a
· 1. This is meaningful on LP (it could be verified) and PR (it could make a practical difference in terms of sensations or choices).
· 3. This is meaningful on both views.
· 5. This is probably meaningless on both views (unless the statement is given some special sense).
· 10. This is meaningless on LP (at least on the version that requires public verifiability). It’s meaningful on PR (since its truth could make a practical difference to Manuel’s experience).
· 15. Since LP itself can’t be tested empirically, it’s meaningless on LP. [To avoid this result, a positivist could claim that LP is true by definition and hence analytic (§3.6). Recall that LP is qualified so that it applies only to synthetic statements. But then the positivist has to use “meaningless” in the unusual sense of “synthetic but not empirical” instead of in the intended sense of “neither true nor false.” This shift takes the bite out of the claim that a statement is “meaningless.” A believer can readily agree that “There is a God” is “meaningless” if all this means is that “There is a God” is synthetic but not empirically testable.] It’s meaningful on PR (since its truth could make a difference to our choices about what we ought to believe).
3.5a
(These answers were adapted from those given by my students.)
· 1. “Is ethics a science?” could mean any of the following:
· Are ethical judgments true or false independently of human feelings and opinions? Can the truth of some ethical judgments be known?
· Can ethics be systematized into a set of rules that will tell us unambiguously what we ought to do in all (or most) cases?
· Can ethical principles be proved using the methods of empirical science?
· Is there some rational method for arriving at ethical judgments that would lead people to agree on their ethical judgments?
· Can a system of ethical principles be drawn up in an axiomatic form, so that ethical theorems can be deduced from axioms accessible to human reason?
· 3. “Is this belief part of common sense?” could mean any of the following:
· Is this belief accepted instinctively or intuitively, as opposed to being the product of reasoning or education?
· Is this belief so entrenched that subtle reasoning to the contrary, even if it seems flawless, has no power to convince us?
· Is this belief something that people of good “horse sense” will accept regardless of their education?
· Is this belief obviously true?
· Is this belief universally accepted?
[In each case we could further specify the group we are talking about – for example, “Is this belief obviously true to anyone who has ever lived (to all those of our own country, or to practically all those of our own country who haven’t been exposed to subtle reasoning on this topic)?”]
· 5. “Are values relative (or absolute)?” could mean any of the following:
· Do different individuals and societies disagree (and to what extent) on values?
· Do people disagree on basic moral principles (and not just on applications)?
· Are all (or some) values incapable of being proved or rationally argued?
· Is it wrong to claim that a moral judgment is correct or incorrect rather than claiming that it’s correct or incorrect relative to such and such a group? Do moral judgments express social conventions rather than truths that hold independently of such conventions?
· Do right and wrong always depend on circumstances (so that no sort of action could be always right or always wrong)?
· In making concrete moral judgments, do different values have to be weighed against each other?
· Are all things that are valued only valued as a means to something else (so that nothing is valued for its own sake)?
· 10. “Is that judgment based on reason?” could be asking whether the judgment is based on the following:
· Self-evident truths, the analysis of concepts, and logical deductions from these (reason versus experience).
· The foregoing plus sense experience, introspection, and inductive arguments (reason versus faith).
· Some sort of thinking or experience or faith (as opposed to being based on mere emotion).
· The thinking and experience and feelings of a sane person (as opposed to those of an insane person).
· An adequate and impartial examination of the available data.
· A process for arriving at truth in which everyone correctly following it would arrive at the same conclusions.
· What is reasonable to believe, or what one ought to believe (or what is permissible to believe) from the standpoint of the seeking of truth.
[We could be asking whether a given person bases his or her judgment on one of the foregoing, or whether the judgment in question could be based on one of the foregoing.]
· 15. “Do you have a soul?” could mean any of the following:
· Do you have a personal identity that could in principle survive death and the disintegration of your body?
· Are you capable of conscious thinking and doing?
· Would an exhaustive description of your material constituents and observable behavior patterns fail to capture important elements of what you are?
· Are you composed of two quite distinct beings – a thinking being without spatial dimensions and a material being incapable of thought?
· Are you capable of caring deeply about anything?
· Are you still alive?
3.6a
· 1. Analytic.
· 3. Synthetic.
· 5. Analytic.
· 10. Analytic.
· 15. Analytic.
· 20. Most philosophers think this is synthetic. St Anselm, Descartes, and Charles Hartshorne argued that it was analytic. See examples 3 and 4 of §6.7b, and examples 9 and 26 of §10.3b.
· 25. Most say synthetic, but some say analytic.
3.7a
· 1. A priori.
· 3. A posteriori.
· 5. A priori.
· 10. A priori.
· 15. A priori.
· 20. Most philosophers think this could only be known a posteriori. Some philosophers think it can be known a priori (see comments on problem 20 of the last section).
· 25. Most philosophers think this could only be known a priori, but a few think it could be known a posteriori.
Chapter 4 answers
4.2a
· 1. Complex question (like “Are you still beating your wife?”).
· 3. Pro–con. The candidate might be a crook. Or an opposing candidate might be even more intelligent and experienced.
· 5. Appeal to the crowd.
· 10. Genetic.
· 15. Appeal to authority.
· 20. None of the labels fit exactly. This vague claim (what is a “discriminating backpacker”?) is probably false (discriminating backpackers tend to vary in their preferences). The closest labels are “appeal to authority,” “appeal to the crowd,” “false stereotype,” or perhaps “appeal to emotion.” There’s some “snob appeal” here too, but this isn’t one of our categories.
· 25. Post hoc ergo propter hoc.
· 30. Appeal to opposition.
· 35. Appeal to emotion.
· 40. Post hoc ergo propter hoc.
· 45. Ad hominem or false stereotype.
· 50. Post hoc ergo propter hoc. The conclusion might still be true, but we’d need a longer argument to show this; many argue, for example, that Bush’s deregulation of banking caused the financial crisis.
· 55. Ambiguous.
· 60. Black and white, or complex question.
4.2b
· 1. Complex question.
· 3. Ambiguity.
· 5. False stereotype.
· 10. Appeal to authority.
· 15. Pro–con.
· 20. Genetic.
· 25. Black and white.
· 30. Ad hominem.
· 35. Appeal to the crowd.
· 40. Part–whole.
· 45. Appeal to authority, ad hominem, or appeal to emotion.
· 50. Circular.
· 55. Complex question.
· 60. Circular (but it still might be true).
4.3a
(The answers for 3 and 5 are representative; other answers may also be correct.)
· 1. There are no universal duties.
· If everyone ought to respect the dignity of others, then there are universal duties.
· ∴ Not everyone ought to respect the dignity of others.
· 3. If we have ethical knowledge, then either ethical truths are provable or there are self-evident ethical truths.
· We have ethical knowledge.
· Ethical truths aren’t provable.
· ∴ There are self-evident ethical truths.
· 5. All human concepts derive from sense experience.
· The concept of logical validity is a human concept.
· ∴ The concept of logical validity derives from sense experience.
· 10. If every rule has an exception, then there’s an exception to this idea too; but then some rule doesn’t have an exception. Statement 10 implies its own falsity and hence is self-refuting.
· 15. If it’s impossible to express truth in human concepts, then statement 15 is false. Statement 15 implies its own falsity and hence is self-refuting.
4.4a
(These are examples of answers and aren’t the only “right answers.”)
· 1. If the agent will probably get caught, then offering the bribe probably isn’t in the agent’s self-interest.
· The agent will probably get caught. (One might give inductive reasoning for this.)
· ∴ Offering the bribe probably isn’t in the agent’s self-interest.
· 3. Some acts that grossly violate the rights of some maximize good consequences (in the sense of maximizing the total of everyone’s interests).
· No acts that grossly violate the rights of some are right.
· ∴ Some acts that maximize good consequences aren’t right.
· 5. Any act that involves lying is a dishonest act (from the definition of “dishonest”).
· Offering the bribe involves lying (falsifying records, and the like).
· ∴ Offering the bribe is a dishonest act.
· 10. Science adequately explains our experience.
· If science adequately explains our experience, then the belief that there is a God is unnecessary to explain our experience.
· ∴ The belief that there is a God is unnecessary to explain our experience.
· Or: Science doesn’t adequately explain certain items of our experience (why these scientific laws govern our universe and not others, why our universe exhibits order, why there exists a world of contingent beings at all, moral obligations, and so on).
· If science doesn’t adequately explain certain items of our experience, then the belief that there is a God is necessary to explain our experience.
· ∴ The belief that there is a God is necessary to explain our experience.
· 15. The idea of logical validity is an idea gained in our earthly existence.
· The idea of logical validity isn’t derived from sense experience.
· ∴ Some ideas gained in our earthly existence don’t derive from sense experience.
Chapter 5 answers
5.2a
· 1. There are 32 such cards out of the 103 remaining cards. So your probability is 32/103 (about 31.1 percent).
· 3. Coins have no memory. The probability of heads is 50 percent.
· 5. The probability that Michigan will win the Rose Bowl is 80 percent times 60 percent times 30 percent, or 14.4 percent.
· 10. You get a number divisible by three 12 out of 36 times. You don’t get it 24 out of 36 times. Thus, mathematically fair betting odds are 2 to 1 (24 to 12) against getting a number divisible by three.
· 15. In 100 such cases, Ohio State would pass 60 times and run 40 times. If we set up to stop the pass, we’d stop them 58 times out of 100 [(60 • 70 percent) + (40 • 40 percent)]. If we set up to stop the run, we’d stop them 62 times out of 100 [(60 • 50 percent) + (40 • 80 percent)]. So we should set up to stop the run.
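The arithmetic in answers 5, 10, and 15 can be replayed directly; here is a minimal Python sketch using only the figures given above.

```python
# Problem 5: probability of winning three independent games.
print(0.80 * 0.60 * 0.30)                  # 0.144, i.e. 14.4 percent

# Problem 10: two dice, total divisible by three.
rolls = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]
hits = sum((d1 + d2) % 3 == 0 for d1, d2 in rolls)
print(hits, len(rolls) - hits)             # 12 vs 24: fair odds 2 to 1 against

# Problem 15: expected stops per 100 plays for each defensive setup.
print(60 * 0.70 + 40 * 0.40)               # 58.0 if we set up to stop the pass
print(60 * 0.50 + 40 * 0.80)               # 62.0 if we set up to stop the run
```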
5.3a
· 1. You shouldn’t believe it. It’s only 12.5 percent (50 • 50 • 50 percent) probable.
· 3. You shouldn’t believe it. It’s 37.5 percent probable, since it happens in 3 of the 8 possible combinations.
· 5. You shouldn’t believe it. It’s not more probable than not; it’s only 50 percent probable.
· 10. You should buy the Enormity Incorporated model. If you buy the Cut-Rate model, there’s an expected replacement cost of $360 ($600 times 60 percent) in addition to the $600 purchase price. This makes the total expected cost $960. The expected cost on the Enormity Incorporated model is $900.
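The expected-cost comparison in answer 10 works the same way; a sketch using the answer's own figures:

```python
# Expected total cost = purchase price + (replacement cost × failure probability).
cut_rate = 600 + 600 * 0.60   # $960 expected for the Cut-Rate model
enormity = 900                # $900 expected for the Enormity Incorporated model
print(min(("Cut-Rate", cut_rate), ("Enormity", enormity), key=lambda m: m[1]))
```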
5.4a
· 1. This is a poor argument, since the sample has little variety.
· 3. This is a poor argument, since the sample is very small and lacks variety.
· 5. This is a good inductive argument (if you aren’t in the polar regions where the sun doesn’t come up at all for several weeks in the winter). In standard form, the argument goes: “All examined days are days when the sun comes up; a large and varied group of days has been examined; tomorrow is a day; so probably tomorrow is a day when the sun comes up.”
· 10. This weakens the argument. Some students cram logic mainly for the Law School Admissions Test (since this test contains many logic problems). You might not have known this, however.
5.5a
· 1. This doesn’t affect the strength of the argument, since the color of the book has little to do with the contents.
· 3. This weakens the argument. It’s less likely that a course taught by a member of the math department would include a discussion of analogical reasoning.
· 5. This weakens the argument. An abstract approach that stresses theory is less likely to discuss analogical reasoning.
· 10. This weakens the argument. A book with only 10 pages on inductive reasoning is less likely to include analogical reasoning.
· 15. This weakens the argument, since it’s a significant point of difference between the two cases.
5.7a
· 1. Using the method of agreement, we conclude that either having a few drinks causes a longer reaction time, or having a longer reaction time causes a person to have a few drinks. The second alternative is less likely in terms of our background information. So we conclude that having a few drinks probably causes a longer reaction time.
· 3. The method of agreement seems to lead to the conclusion that the soda caused the hangover. However, we know that scotch, gin, and rum all contain alcohol. So soda isn’t the only factor common to all four cases; there’s also the alcohol. So the method of agreement doesn’t apply here. To decide whether the soda or the alcohol caused the hangover, Michelle would have to experiment with drinking soda but no alcohol, and drinking alcohol but no soda.
· 5. Using the method of agreement, we’d conclude that either factor K caused cancer or cancer caused factor K. If we found some drug to eliminate factor K, then we could try it and see whether it eliminates cancer. If eliminating factor K eliminated cancer, then it’s likely that factor K caused cancer. But if factor K came back after we eliminated it, then it’s likely that cancer caused factor K.
· 10. Using the method of disagreement, we’d conclude that eating raw garlic doesn’t by itself necessarily cause mosquitoes to stop biting you.
· 15. Using the method of agreement, we’d conclude that either the combination of factors (heating or striking dry matches in the presence of oxygen) causes the match to light, or else the lighting of the match causes the combination of factors. The latter is implausible (it involves a present fire causing a past heating or striking). So probably the combination of factors causes the match to light.
· 20. By the method of variation, it’s likely that an increase in the electrical voltage is the cause of the increase in the electrical current, or the electrical current is the cause of the electrical voltage, or something else caused them both. We know (but perhaps little Will doesn’t) that we can have a voltage without a current (such as when nothing is plugged in to our electrical socket) but we can’t have a current without a voltage. So we’d think that voltage causes current (and not vice versa) and reject the “electrical current is the cause of the electrical voltage” alternative. So we’d conclude that probably an increase in the electrical voltage is the cause of the increase in the electrical current, or else some other factor (Will’s curiosity, for example) caused both increases.
· 25. By the method of difference, wearing a single pair of socks probably is (or is part of) the cause of the blisters, or the blisters are (or are part of) the cause of wearing a single pair of socks. The latter is impossible, since a present event can’t cause a past event. So probably wearing a single pair of socks is (or is part of) the cause of the blisters. Since we know that we don’t get blisters from wearing a single pair of socks without walking, we’d conclude that wearing a single pair of socks is only part of the cause of the blisters.
5.8a
· 1. The problem is how to do the experiment so that differences in air resistance won’t get in the way. We could build a 100-foot tower on the moon (or some planet without air), drop a feather and a rock from the top, and see if both strike the ground at the same time. Or we might go to the top of a high building and drop rocks of different weights to see if they land at about the same time (with perhaps very minor time differences due to minor differences in air resistance between rocks).
· 3. We could study land patterns (hills, rock piles, eccentric boulders, and so on) left by present-day glaciers in places like Alaska, compare land patterns of areas that we are fairly sure weren’t covered by glaciers, and compare both with those of Wisconsin. Mill’s method of agreement might lead us to conclude that glaciers probably caused the land patterns in Wisconsin. To date the glacier, we’d have to find some “natural calendar” (such as the yearly rings in tree trunks, yearly sediment layers on the bottoms of lakes, corresponding layers in sedimentary rocks, or carbon breakdown) and connect it with Wisconsin climatic changes or land patterns.
· 5. We could give both groups an intelligence test. The problem is that the first child might test higher, not because of greater innate intelligence, but because of differences in how the first and the last child are brought up. (The last child, but not the first, is normally brought up with other children around and by older parents.) To eliminate this factor, we might test adopted children. If we find that a child born first and one born last tend to test equally (or unequally) in the same sort of adoptive environment, then we could conclude that the two groups tend (or don’t tend) to have the same innate intelligence.
· 10. See the answer to problem 3. Any data making statement 3 probable would make 10 improbable. In addition, if we found any “natural calendar” that gives a strong inductive argument concerning any events occurring over 5,000 years ago, this also would make 10 unlikely. [Of course, these are only inductive arguments; it’s possible for the premises to be all true and conclusion false.]
Chapter 6 answers
6.1a
· 1. ∼(A • B)
· 3. ((A • B) ∨ C)
· 5. ((A ⊃ B) ∨ C)
· 10. (A ⊃ ∼(∼B • ∼C))
· 15. (∼(E ∨ P) ⊃ ∼R)
· 20. E [“(M ∨ F)” is wrong, since the English sentence doesn’t mean “Everyone is male or everyone is female.”]
6.2a
· 1. 1
· 3. 1
· 5. 0
· 10. 1
· 15. 0
6.3a
· 1. ∼(1 • 0) = ∼0 = 1
· 3. ∼(∼1 • ∼0) = ∼(0 • 1) = ∼0 = 1
· 5. (∼0 ≡ 0) = (1 ≡ 0) = 0
· 10. (∼1 ∨ ∼(0 ⊃ 0)) = (0 ∨ ∼1) = (0 ∨ 0) = 0
· 15. ∼((1 ⊃ 1) ⊃ (1 ⊃ 0)) = ∼(1 ⊃ 0) = ∼0 = 1
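These 0/1 evaluations just apply the truth tables for the five connectives. A small Python sketch (the function names are mine) that reproduces answer 15:

```python
# Truth functions on 0/1 values.
NOT = lambda p: 1 - p
AND = lambda p, q: p * q
OR  = lambda p, q: max(p, q)
IF  = lambda p, q: max(1 - p, q)      # p ⊃ q is 0 only when p = 1 and q = 0
IFF = lambda p, q: 1 if p == q else 0

# Problem 15: ∼((1 ⊃ 1) ⊃ (1 ⊃ 0)) = ∼(1 ⊃ 0) = ∼0 = 1
print(NOT(IF(IF(1, 1), IF(1, 0))))    # 1
```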
6.4a
· 1. (? • 0) = 0
· 3. (? ∨ ∼0) = (? ∨ 1) = 1
· 5. (0 ⊃ ?) = 1
· 10. (? ⊃ ∼0) = (? ⊃ 1) = 1
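A formula containing an unknown “?” has a settled value only when both ways of filling in the unknown give the same result, which is what these answers check. A sketch (the `settled` helper is hypothetical, not from the book):

```python
AND = lambda p, q: p * q
IF  = lambda p, q: max(1 - p, q)

def settled(formula):
    """Return the formula's value if it's the same for ? = 0 and ? = 1, else '?'."""
    values = {formula(q) for q in (0, 1)}
    return values.pop() if len(values) == 1 else "?"

print(settled(lambda q: AND(q, 0)))   # 0: (? • 0) = 0 either way
print(settled(lambda q: IF(0, q)))    # 1: (0 ⊃ ?) = 1 either way
print(settled(lambda q: AND(q, 1)))   # ?: (? • 1) depends on the unknown
```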
6.5a
· 1.
P Q | (P ≡ ∼Q)
0 0 | 0
0 1 | 1
1 0 | 1
1 1 | 0
· 3.
P Q R | (P ∨ (Q • ∼R))
0 0 0 | 0
0 0 1 | 0
0 1 0 | 1
0 1 1 | 0
1 0 0 | 1
1 0 1 | 1
1 1 0 | 1
1 1 1 | 1
· 5.
P Q | ((P ≡ Q) ⊃ Q)
0 0 | 0
0 1 | 1
1 0 | 1
1 1 | 1
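Tables like these can be generated by running through every assignment; a minimal sketch (the `truth_table` helper is mine):

```python
from itertools import product

def truth_table(letters, formula):
    print(" ".join(letters), "|", "value")
    for row in product((0, 1), repeat=len(letters)):
        print(*row, "|", formula(*row))

# Problem 1: (P ≡ ∼Q)
truth_table("PQ", lambda p, q: 1 if p == 1 - q else 0)
```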
6.6a
· 1. Invalid: second row has 110.
C D | (C ⊃ D), D ∴ C
0 0 | 1 0 0
0 1 | 1 1 0
1 0 | 0 0 1
1 1 | 1 1 1
· 3. Valid: no row has 110.
T B | (T ⊃ B), (T ⊃ ∼B) ∴ ∼T
0 0 | 1 1 1
0 1 | 1 1 1
1 0 | 0 1 0
1 1 | 1 0 0
· 5. Invalid: row 4 has 1110. (I once got a group together but couldn’t get Grand Canyon backcountry reservations. So we instead explored canyons near Escalante, Utah. This made R = 0, T = 1, and E = 1.)
· 10. Invalid: row 1 has 110.
S E R | (S ⊃ (E • ∼R)), ∼E ∴ R
0 0 0 | 1 1 0
0 0 1 | 1 1 1
0 1 0 | 1 0 0
0 1 1 | 1 0 1
1 0 0 | 0 1 0
1 0 1 | 0 1 1
1 1 0 | 1 0 0
1 1 1 | 0 0 1
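The validity test behind these tables scans the rows for one that makes every premise 1 and the conclusion 0. A sketch (helper names are mine) checking problems 1 and 3:

```python
from itertools import product

IF = lambda p, q: max(1 - p, q)

def valid(n_letters, premises, conclusion):
    for row in product((0, 1), repeat=n_letters):
        if all(p(*row) for p in premises) and not conclusion(*row):
            return False                  # found a row with all premises 1, conclusion 0
    return True

# Problem 1: (C ⊃ D), D ∴ C — invalid (the row C=0, D=1 gives 110).
print(valid(2, [lambda c, d: IF(c, d), lambda c, d: d], lambda c, d: c))  # False
# Problem 3: (T ⊃ B), (T ⊃ ∼B) ∴ ∼T — valid.
print(valid(2, [lambda t, b: IF(t, b), lambda t, b: IF(t, 1 - b)],
            lambda t, b: 1 - t))                                          # True
```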
6.7a
· 1. ∼(N1 ≡ H1) ≠ 1 Valid
· N1 = 1
· ∴ ∼H1 = 0
· 3. ((T ∨ M1) ⊃ Q0) ≠ 1 Valid
· M1 = 1
· ∴ Q0 = 0
· 5. ((L0 • F1) ⊃ S1) = 1 Invalid
· S1 = 1
· F1 = 1
· ∴ L0 = 0
· 10. (∼T0 ⊃ (P1 ⊃ J0)) ≠ 1 Valid
· P1 = 1
· ∼J0 = 1
· ∴ T0 = 0
· 15. A1 = 1 Valid
· ∼A1 ≠ 1
· ∴ B0 = 0
(An argument with inconsistent premises is always valid: if the premises can’t all be true, we can’t have premises all true and conclusion false. But such an argument can’t be sound, since the premises can’t all be true. This argument is controversial – see §17.2.)
6.7b
· 1. C Valid
· A
· ((C • A) ⊃ (F ∨ I))
· ∼I
· ∴ F
· 3. ((U • ∼R) ⊃ C) Valid
· ∼C
· U
· ∴ R
· 5. ((S • ∼M) ⊃ D) Valid
· ∼D
· S
· ∴ M
· 10. (I ⊃ (U ∨ ∼P)) Invalid
· ∼U
· ∼P
· ∴ I
· 15. ((M • S) ⊃ G) Valid
· S
· ∼G
· ∴ ∼M
· 20. ((I • ∼D) ⊃ R) Valid
· ∼D
· I
· ∴ R
6.8a
· 1. (S ⊃ (Y • I))
· 3. (Q ∨ R)
· 5. (∼T ⊃ ∼P)
· 10. (A ⊃ E) or, equivalently, (∼E ⊃ ∼A)
· 15. (S ⊃ W)
6.9a
· 1. (S ⊃ ∼K) Valid
· K
· ∴ ∼S
· The implicit premise 2 is “We can know something that we aren’t presently sensing.”
· 3. ((B • ∼Q) ⊃ O) Invalid
· ∼B
· ∴ ∼O
· 5. (S ⊃ A) Valid
· ∼A
· ∴ ∼S
· The implicit premise 2 is “The basic principles of ethics aren’t largely agreed upon by intelligent people who have studied ethics.”
· 10. (K ⊃ (P ∨ S)) Valid
· ∼P
· ∼S
· ∴ ∼K
· 15. (O ⊃ (H ∨ C)) Valid
· ∼C
· O
· ∴ H
· 20. G Valid
· ∼S
· ((M • G) ⊃ S)
· ∴ ∼M
6.10a
· 1. P, U
· 3. no conclusion
· 5. ∼R, ∼S
· 10. H, I
· 15. no conclusion
· 20. no conclusion
6.11a
· 1. ∼T
· 3. ∼B
· 5. no conclusion
· 10. no conclusion
· 15. F
· 20. Y
6.12a
· 1. ∼U
· 3. no conclusion
· 5. P, ∼Q
· 10. ∼A
· 15. no conclusion
Chapter 7 answers
7.1a
· 1. Valid
· 3. Valid
· 5. Valid
· 10. Valid
7.1b
· 1. Valid
· 3. Valid
· 5. Valid
· (This also could be translated without the NOTs – by letting “W,” for example, stand for “God doesn’t want to prevent evil.”)
· 10. Valid
7.2a
· 1. Invalid
· * 1 (A ∨ B)
· [∴ A
· 2 asm: ∼A
· 3 ∴ B {from 1 and 2}
∼A, B
· 3. Invalid
· 1 ∼(A • ∼B)
· [∴ ∼(B • ∼A)
· * 2 asm: (B • ∼A)
· 3 ∴ B {from 2}
· 4 ∴ ∼A {from 2}
B, ∼A
· 5. Invalid
· 1 ((A ⊃ B) ⊃ (C ⊃ D))
· * 2 (B ⊃ D)
· * 3 (A ⊃ C)
· [∴ (A ⊃ D)
· * 4 asm: ∼(A ⊃ D)
· 5 ∴ A {from 4}
· 6 ∴ ∼D {from 4}
· 7 ∴ ∼B {from 2 and 6}
· 8 ∴ C {from 3 and 5}
A, ∼D, ∼B, C
· 10. Invalid
· * 1 ∼(∼A • ∼B)
· 2 ∼C
· * 3 (D ∨ ∼A)
· * 4 ((C • ∼E) ⊃ ∼B)
· 5 ∼D
· [∴ ∼E
· 6 asm: E
· 7 ∴ ∼A {from 3 and 5}
· 8 ∴ B {from 1 and 7}
· 9 ∴ ∼(C • ∼E) {from 4 and 8}
∼C, ∼D, E, ∼A, B
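Refutation lines like “∼C, ∼D, E, ∼A, B” can be verified by plugging the assignment back in: every premise should come out true and the conclusion false. A sketch (helper names are mine) checking problem 5's refutation:

```python
IF = lambda p, q: (not p) or q

def refutes(v, premises, conclusion):
    """True if v makes all premises true and the conclusion false."""
    return all(p(v) for p in premises) and not conclusion(v)

# Problem 5's refutation: A, ∼D, ∼B, C.
v = dict(A=True, B=False, C=True, D=False)
print(refutes(v,
              [lambda v: IF(IF(v["A"], v["B"]), IF(v["C"], v["D"])),
               lambda v: IF(v["B"], v["D"]),
               lambda v: IF(v["A"], v["C"])],
              lambda v: IF(v["A"], v["D"])))   # True: the refutation works
```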
7.2b
· 1. Invalid
· 1 (S ⊃ K)
· * 2 (M ⊃ K)
· 3 M
· [∴ S
· 4 asm: ∼S
· 5 ∴ K {from 2 and 3}
M, ∼S, K
· 3. Valid
· 5. Valid
· 10. Valid
· 15. Invalid
· * 1 (A ⊃ B)
· * 2 (B ⊃ (F ⊃ M))
· * 3 (M ⊃ ∼H)
· 4 H
· [∴ ∼A
· 5 asm: A
· 6 ∴ B {from 1 and 5}
· * 7 ∴ (F ⊃ M) {from 2 and 6}
· 8 ∴ ∼M {from 3 and 4}
· 9 ∴ ∼F {from 7 and 8}
An “F” premise would make it valid.
H, A, B, ∼M, ∼F
· 20. Valid
· 25. Valid
7.3a
· 1. Valid
· 3. Valid
· 5. Valid
7.3b
· 1. Valid
· 3. Valid
· 5. Valid
· 10. Valid
7.4a
· 1. Invalid
· 1 ∼(A • B)
· [∴ (∼A • ∼B)
· ** 2 asm: ∼(∼A • ∼B)
· 3 asm: ∼A {break 1}
· 4 ∴ B {from 2 and 3}
∼A, B
· 3. Invalid
· 1 (A ⊃ B)
· 2 (C ⊃ (∼D • E))
· [∴ (D ∨ F)
· * 3 asm: ∼(D ∨ F)
· 4 ∴ ∼D {from 3}
· 5 ∴ ∼F {from 3}
· 6 asm: ∼A {break 1}
· 7 asm: ∼C {break 2}
∼D, ∼F, ∼A, ∼C
· 5. Invalid
· 1 (A ⊃ (B • C))
· ** 2 ((D ⊃ E) ⊃ A)
· [∴ (E ∨ C)
· * 3 asm: ∼(E ∨ C)
· 4 ∴ ∼E {from 3}
· 5 ∴ ∼C {from 3}
· 6 asm: ∼A {break 1}
· ** 7 ∴ ∼(D ⊃ E) {from 2 and 6}
· 8 ∴ D {from 7}
∼E, ∼C, ∼A, D
7.4b
· 1. Invalid
· 1 (M ⊃ ∼B)
· 2 ∼M
· 3 (B ⊃ (P • G))
· [∴ G
· 4 asm: ∼G
· 5 asm: ∼B {break 3}
∼M, ∼G, ∼B
· 3. Invalid
· 1 (∼R ⊃ (O • ∼S))
· [∴ (R ⊃ (C • S))
· * 2 asm: ∼(R ⊃ (C • S))
· 3 ∴ R {from 2}
· 4 ∴ ∼(C • S) {from 2}
· 5 asm: ∼C {break 4}
R, ∼C
· 5. Invalid
· 1 ((A • L) ⊃ (D • M))
· 2 (M ⊃ ∼C)
· 3 (S ⊃ L)
· [∴ ((A • ∼S) ⊃ C)
· * 4 asm: ∼((A • ∼S) ⊃ C)
· * 5 ∴ (A • ∼S) {from 4}
· 6 ∴ ∼C {from 4}
· 7 ∴ A {from 5}
· 8 ∴ ∼S {from 5}
· ** 9 asm: ∼(A • L) {break 1}
· 10 ∴ ∼L {from 7 and 9}
∼C, A, ∼S, ∼L
· 10. Valid
· 15. Invalid
· 1 ((E • F) ⊃ W)
· 2 ((W • M) ⊃ (B • ∼N))
· [∴ (N ⊃ ∼E)
· * 3 asm: ∼(N ⊃ ∼E)
· 4 ∴ N {from 3}
· 5 ∴ E {from 3}
· ** 6 asm: ∼(E • F) {break 1}
· 7 ∴ ∼F {from 5 and 6}
· 8 asm: ∼(W • M) {break 2}
· 9 asm: ∼W {break 8}
N, E, ∼F, ∼W
Chapter 8 answers
8.1a
· 1. ∼Cx
· 3. (∃x)∼Cx
· 5. (x)Cx
· 10. ∼(∃x)(Lx • Ex)
· 15. (∃x)(Ax • (∼Bx • Dx))
· 20. ∼(x)(Cx ⊃ Px)
· 25. (x)(Cx • Lx)
8.2a
· 1. Valid
· 3. Valid
· 5. Valid
· 10. Valid
8.2b
· 1. Valid
· 3. Valid
· 5. Valid
· 10. Valid
8.3a
· 1. Invalid
· 1 (∃x)Fx
· [∴ (x)Fx
· * 2 asm: ∼(x)Fx
· 3 ∴ Fa {from 1}
· * 4 ∴ (∃x)∼Fx {from 2}
· 5 ∴ ∼Fb {from 4}
a, b
Fa, ∼Fb
· 3. Invalid
· * 1 (∃x)(Fx ∨ Gx)
· * 2 ∼(x)Fx
· [∴ (∃x)Gx
· * 3 asm: ∼(∃x)Gx
· * 4 ∴ (Fa ∨ Ga) {from 1}
· * 5 ∴ (∃x)∼Fx {from 2}
· 6 ∴ (x)∼Gx {from 3}
· 7 ∴ ∼Fb {from 5}
· 8 ∴ ∼Ga {from 6}
· 9 ∴ Fa {from 4 and 8}
· 10 ∴ ∼Gb {from 6}
a, b
Fa, ∼Ga, ∼Fb, ∼Gb
· 5. Invalid
· * 1 ∼(∃x)(Fx • Gx)
· 2 (x)∼Fx
· [∴ (x)Gx
· * 3 asm: ∼(x)Gx
· 4 ∴ (x)∼(Fx • Gx) {from 1}
· * 5 ∴ (∃x)∼Gx {from 3}
· 6 ∴ ∼Ga {from 5}
· 7 ∴ ∼Fa {from 2}
· 8 ∴ ∼(Fa • Ga) {from 4}
a
∼Ga, ∼Fa
· 10. Invalid
· * 1 (∃x)∼Fx
· * 2 (∃x)∼Gx
· [∴ (∃x)(Fx ≡ Gx)
· * 3 asm: ∼(∃x)(Fx ≡ Gx)
· 4 ∴ ∼Fa {from 1}
· 5 ∴ ∼Gb {from 2}
· 6 ∴ (x)∼(Fx ≡ Gx) {from 3}
· * 7 ∴ ∼(Fa ≡ Ga) {from 6}
· * 8 ∴ (Fa ∨ Ga) {from 7}
· 9 ∴ ∼(Fa • Ga) {from 7}
· 10 ∴ Ga {from 4 and 8}
· * 11 ∴ ∼(Fb ≡ Gb) {from 6}
· * 12 ∴ (Fb ∨ Gb) {from 11}
· 13 ∴ ∼(Fb • Gb) {from 11}
· 14 ∴ Fb {from 5 and 12}
a, b
Ga, ∼Fa, Fb, ∼Gb
8.3b
· 1. Invalid
· * 1 (∃x)(Bx • Gx)
· [∴ (x)(Bx ⊃ Gx)
· * 2 asm: ∼(x)(Bx ⊃ Gx)
· * 3 ∴ (∃x)∼(Bx ⊃ Gx) {from 2}
· * 4 ∴ (Ba • Ga) {from 1}
· 5 ∴ Ba {from 4}
· 6 ∴ Ga {from 4}
· * 7 ∴ ∼(Bb ⊃ Gb) {from 3}
· 8 ∴ Bb {from 7}
· 9 ∴ ∼Gb {from 7}
a, b
Ba, Ga, Bb, ∼Gb
· 3. Invalid
· * 1 (∃x)Sx
· * 2 ∼(x)Cx
· [∴ (∃x)(Sx • ∼Cx)
· * 3 asm: ∼(∃x)(Sx • ∼Cx)
· 4 ∴ Sa {from 1}
· * 5 ∴ (∃x)∼Cx {from 2}
· 6 ∴ (x)∼(Sx • ∼Cx) {from 3}
· 7 ∴ ∼Cb {from 5}
· * 8 ∴ ∼(Sa • ∼Ca) {from 6}
· 9 ∴ Ca {from 4 and 8}
· * 10 ∴ ∼(Sb • ∼Cb) {from 6}
· 11 ∴ ∼Sb {from 7 and 10}
a, b
Sa, Ca, ∼Sb, ∼Cb
· 5. Valid
· 10. Valid
· 15. Invalid
· * 1 ∼(∃x)(Px • Bx)
· * 2 (∃x)(Cx • ∼Bx)
· [∴ (∃x)(Cx • Px)
· * 3 asm: ∼(∃x)(Cx • Px)
· 4 ∴ (x)∼(Px • Bx) {from 1}
· * 5 ∴ (Ca • ∼Ba) {from 2}
· 6 ∴ (x)∼(Cx • Px) {from 3}
· 7 ∴ Ca {from 5}
· 8 ∴ ∼Ba {from 5}
· 9 ∴ ∼(Pa • Ba) {from 4}
· * 10 ∴ ∼(Ca • Pa) {from 6}
· 11 ∴ ∼Pa {from 7 and 10}
a
Ca, ∼Ba, ∼Pa
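Quantified refutations can be checked the same way over their small domains; a sketch (variable names are mine) for problem 15's one-object refutation:

```python
# Domain {a} with Ca true, Ba false, Pa false (problem 15's refutation).
domain = ["a"]
C = {"a": True}; B = {"a": False}; P = {"a": False}

premise1 = not any(P[x] and B[x] for x in domain)    # ∼(∃x)(Px • Bx)
premise2 = any(C[x] and not B[x] for x in domain)    # (∃x)(Cx • ∼Bx)
conclusion = any(C[x] and P[x] for x in domain)      # (∃x)(Cx • Px)
print(premise1 and premise2 and not conclusion)      # True: the refutation works
```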
8.4a
· 1. (Cg ∨ Eg)
· 3. ((x)Lx ⊃ (x)Ex)
· 5. ((∃x)Ex ⊃ R)
· 10. ((x)Ex ⊃ (x)(Lx ⊃ Ex))
· 15. ∼(∃x)Ex or, equivalently, (x)∼Ex
· 20. ∼(∃x)(Lx • Ex) or, equivalently, (x)∼(Lx • Ex)
8.5a
· 1. Valid
· 3. Invalid
· * 1 ((x)Ex ⊃ R)
· [∴ (x)(Ex ⊃ R)
· * 2 asm: ∼(x)(Ex ⊃ R)
· * 3 ∴ (∃x)∼(Ex ⊃ R) {from 2}
· * 4 ∴ ∼(Ea ⊃ R) {from 3}
· 5 ∴ Ea {from 4}
· 6 ∴ ∼R {from 4}
· * 7 ∴ ∼(x)Ex {from 1 and 6}
· * 8 ∴ (∃x)∼Ex {from 7}
· 9 ∴ ∼Eb {from 8}
a, b
Ea, ∼Eb, ∼R
· 5. Invalid
a, b
Fa, ∼Ga, Gb
· 10. Invalid
· * 1 ∼(∃x)(Fx • Gx)
· 2 ∼Fd
· [∴ Gd
· 3 asm: ∼Gd
· 4 ∴ (x)∼(Fx • Gx) {from 1}
· 5 ∴ ∼(Fd • Gd) {from 4}
d
∼Fd, ∼Gd
· 15. Valid
8.5b
· 1. Valid
· 3. Valid
· 5. Valid
· 10. Valid
· 15. Valid
· 20. Valid
Chapter 9 answers
9.1a
· 1. La
· 3. ∼a=p
· 5. (∃x)(∃y)(∼x=y • (Lx • Ly))
· 10. (∃x)(Lx • ∼(∃y)(∼y=x • Ly))
· 15. (Ra • ∼a=f)
9.2a
· 1. Invalid
· 1 Fa
· [∴ ∼(∃x)(Fx • ∼x=a)
· * 2 asm: (∃x)(Fx • ∼x=a)
· * 3 ∴ (Fb • ∼b=a) {from 2}
· 4 ∴ Fb {from 3}
· 5 ∴ ∼b=a {from 3}
a, b
Fa, Fb, ∼b=a
· 3. Valid
· 5. Invalid
· 1 ∼a=b
· 2 ∼c=b
· [∴ a=c
· 3 asm: ∼a=c
a, b, c
∼a=b, ∼a=c, ∼c=b
· 10. Invalid
· [∴ (∃x)(∃y)∼y=x
· * 1 asm: ∼(∃x)(∃y)∼y=x
· 2 ∴ (x)∼(∃y)∼y=x {from 1}
· * 3 ∴ ∼(∃y)∼y=a {from 2}
· 4 ∴ (y)y=a {from 3}
· 5 ∴ a=a {from 4}
a
a=a
9.2b
· 1. Valid
· 3. Valid
· 5. Invalid
· 1 ∼Bm
· 2 ∼Bu
· [∴ u=m
· 3 asm: ∼u=m
m, u
∼Bm, ∼Bu, ∼u=m
· 10. Valid
· 15. Valid
9.3a
· 1. (Lto • Lot)
· 3. (x)(Rx ⊃ Ltx)
· 5. ((x)Lxo • ∼(x)Lox)
· 10. (x)(Lxx ⊃ Lox)
· 15. (x)(Cgx ⊃ Lgx)
· 20. (x)(Cgx ⊃ Ggx)
9.4a
· 1. (x)(Rx ⊃ (y)Lyx) or, equivalently, (x)(y)(Ry ⊃ Lxy)
· 3. (∃x)(Rx • (∃y)Lyx) or, equivalently, (∃x)(∃y)(Ry • Lxy)
· 5. (x)(Rx ⊃ (∃y)(Iy • Lxy))
· 10. ∼(∃x)(Ix • (y)Lxy)
· 15. ((x)Ltx ⊃ (∃x)(Ix • (y)Lxy))
· 20. (x)(∃y)Cyx
· 25. (x)(y)(Cxy ⊃ Lxy)
9.5a
· 1. Invalid
· 1 (x)Lxa
· [∴ (x)Lax
· * 2 asm: ∼(x)Lax
· * 3 ∴ (∃x)∼Lax {from 2}
· 4 ∴ ∼Lab {from 3}
· 5 ∴ Laa {from 1}
· 6 ∴ Lba {from 1}
a, b
Lba, Laa, ∼Lab
· 3. Invalid
· 1 (x)(y)(Lxy ⊃ x=y)
· [∴ (x)Lxx
· * 2 asm: ∼(x)Lxx
· * 3 ∴ (∃x)∼Lxx {from 2}
· 4 ∴ ∼Laa {from 3}
· 5 ∴ (y)(Lay ⊃ a=y) {from 1}
· 6 ∴ (Laa ⊃ a=a) {from 5}
a
∼Laa
· 5. Valid
· 10. Valid
· 15. Valid
9.5b
· 1. Valid
· 3. Invalid
· 1 Oab
· [∴ ∼Oba
· 2 asm: Oba
a, b
Oab, Oba
· To make it valid, we need the premise that “older than” is asymmetrical: “(x)(y)(Oxy ⊃ ∼Oyx)” – “In every case, if x is older than y, then y isn’t older than x.”
· 5. Invalid
· 1 (x)(∃y)Dxy
· [∴ (∃y)(x)Dxy
· * 2 asm: ∼(∃y)(x)Dxy
· 3 ∴ (y)∼(x)Dxy {from 2}
· * 4 ∴ (∃y)Day {from 1}
· 5 ∴ Dab {from 4}
· 6 ∴ ∼(x)Dxb {from 3}
· 7 ∴ (∃x)∼Dxb {from 6}
· Endless loop: we add further wffs to make the premise true and conclusion false. “∼Dab, ∼Dba, Daa, Dbb” also refutes the argument.
a, b
Dab, Dba, ∼Daa, ∼Dbb
· 10. Valid
· 15. Valid
· 20. Valid
· 25. Valid
Chapter 10 answers
10.1a
· 1. ☐G
· 3. ∼☐M
· 5. ☐(R ⊃ P)
· 10. Ambiguous: (R ⊃ ☐R) or ☐(R ⊃ R)
· 15. (A ⊃ ☐B)
· 20. ☐(H ∨ T)
· 25. (R ⊃ ☐E)
· 30. ☐(G ⊃ ☐G)
10.2a
· 1. Valid
· 3. Valid
· 5. Valid
· 10. Valid
10.2b
· 1. Valid
· 3. Valid
· 5. Valid
· 10. Valid
· 15. Valid
10.3a
· 1. Invalid
· * 1 ◇A
· [∴ ☐A
· * 2 asm: ∼☐A
· 3 W ∴ A {from 1}
· * 4 ∴ ◇∼A {from 2}
· 5 WW ∴ ∼A {from 4}
· 3. Invalid
· * 1 ◇A
· * 2 ◇B
· [∴ ◇(A • B)
· * 3 asm: ∼◇(A • B)
· 4 W ∴ A {from 1}
· 5 WW ∴ B {from 2}
· 6 ∴ ☐∼(A • B) {from 3}
· * 7 W ∴ ∼(A • B) {from 6}
· 8 W ∴ ∼B {from 4 and 7}
· * 9 WW ∴ ∼(A • B) {from 6}
· 10 WW ∴ ∼A {from 5 and 9}
· 5. Invalid
· 1 (☐A ⊃ ☐B)
· [∴ ☐(A ⊃ B)
· * 2 asm: ∼☐(A ⊃ B)
· * 3 ∴ ◇∼(A ⊃ B) {from 2}
· * 4 W ∴ ∼(A ⊃ B) {from 3}
· 5 W ∴ A {from 4}
· 6 W ∴ ∼B {from 4}
· ** 7 asm: ∼☐A {break 1}
· ** 8 ∴ ◇∼A {from 7}
· 9 WW ∴ ∼A {from 8}
· 10. Invalid
· * 1 ∼☐A
· 2 ☐(B ≡ A)
· [ ∴ ∼◇B
· * 3 asm: ◇B
· * 4 ∴ ◇∼A {from 1}
· 5 W ∴ B {from 3}
· 6 WW ∴ ∼A {from 4}
· * 7 W ∴ (B ≡ A) {from 2}
· * 8 W ∴ (B ⊃ A) {from 7}
· 9 W ∴ (A ⊃ B) {from 7}
· 10 W ∴ A {from 5 and 8}
· * 11 WW ∴ (B ≡ A) {from 2}
· * 12 WW ∴ (B ⊃ A) {from 11}
· 13 WW ∴ (A ⊃ B) {from 11}
· 14 WW ∴ ∼B {from 6 and 12}
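The world-by-world refutations here can also be checked by brute force. A sketch (names are mine, and it assumes, as suffices for this unstacked example, that every world counts as possible) for problem 1: ◇A comes out true and ☐A false when A holds in one world and fails in another.

```python
# Two worlds refute "◇A ∴ ☐A": A is true in W and false in WW.
worlds = {"W": {"A": True}, "WW": {"A": False}}

possibly_A    = any(w["A"] for w in worlds.values())   # ◇A: true in some world
necessarily_A = all(w["A"] for w in worlds.values())   # ☐A: true in all worlds
print(possibly_A, necessarily_A)   # True False: premise 1, conclusion 0
```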
10.3b
· 1. Valid
· 3. Invalid
· 1 ☐(B ⊃ B)
· [ ∴ (B ⊃ ☐B)
· * 2 asm: ∼(B ⊃ ☐B)
· 3 ∴ B {from 2}
· * 4 ∴ ∼☐B {from 2}
· * 5 ∴ ◇∼B {from 4}
· 6 W ∴ ∼B {from 5}
· 7 ∴ (B ⊃ B) {from 1}
· 8 W ∴ (B ⊃ B) {from 1}
· 5. Invalid
· 1 ☐(R ⊃ F)
· 2 ☐(U ⊃ ∼R)
· [ ∴ ∼☐(F ⊃ U)
· 3 asm: ☐(F ⊃ U)
· 4 ∴ (R ⊃ F) {from 1}
· 5 ∴ (U ⊃ ∼R) {from 2}
· 6 ∴ (F ⊃ U) {from 3}
· 7 asm: ∼R {break 4}
· 8 asm: ∼F {break 6}
∼R, ∼F
· 10. Invalid
· 1 ☐(D ∨ ∼D)
· * 2 (☐D ⊃ ∼F)
· * 3 (☐∼D ⊃ ∼F)
· [ ∴ ∼F
· 4 asm: F
· * 5 ∴ ∼☐D {from 2 and 4}
· * 6 ∴ ∼☐∼D {from 3 and 4}
· * 7 ∴ ◇∼D {from 5}
· * 8 ∴ ◇D {from 6}
· 9 W ∴ ∼D {from 7}
· 10 WW ∴ D {from 8}
· 11 W ∴ (D ∨ ∼D) {from 1}
· 12 WW ∴ (D ∨ ∼D) {from 1}
· 13 ∴ (D ∨ ∼D) {from 1}
· 14 asm: D {break 13}
· 15. Valid
· 20. Valid
· 25. Valid
Chapter 11 answers
11.1a
· 1. Valid in B or S5.
· 3. Valid in S4 or S5.
· 5. Valid in S5.
· 10. Valid in B or S5.
· 15. Valid in S4 or S5.
11.1b
· 1. Valid in S5.
· 3. This side is valid in S5.
· The other side is valid in S4 or S5.
· 5. Valid in S4 or S5.
11.2a
· 1. (x)◇Ux
· 3. ☐Uj
· 5. (Ns • ◇∼Ns)
· 10. (x)(Nx ⊃ ☐Ax)
· 15. ◇(x)(Cx ⊃ Tx)
· 20. (∃x)☐Ux
11.3a
· 1. Valid
· 3. Valid
· 5. Valid
· 10. Valid
11.3b
· 1. Invalid
· 1 Bi
· [ ∴ ☐(x)(∼Bx ⊃ ∼x=i)
· * 2 asm: ∼☐(x)(∼Bx ⊃ ∼x=i)
· * 3 ∴ ◇∼(x)(∼Bx ⊃ ∼x=i) {from 2}
· * 4 W ∴ ∼(x)(∼Bx ⊃ ∼x=i) {from 3}
· * 5 W ∴ (∃x)∼(∼Bx ⊃ ∼x=i) {from 4}
· * 6 W ∴ ∼(∼Ba ⊃ ∼a=i) {from 5}
· 7 W ∴ ∼Ba {from 6}
· 8 W ∴ a=i {from 6}
· 9 W ∴ ∼(∼Bi ⊃ ∼i=i) {from 6 and 8}
· 10 W ∴ ∼Bi {from 7 and 8}
a, i
· 3. Valid
· 5. Invalid
· 1 ☐(∃x)Ux
· [ ∴ (∃x)☐Ux
· * 2 asm: ∼(∃x)☐Ux
· 3 ∴ (x)∼☐Ux {from 2}
· * 4 ∴ (∃x)Ux {from 1}
· 5 ∴ Ua {from 4}
· * 6 ∴ ∼☐Ua {from 3}
· * 7 ∴ ◇∼Ua {from 6}
· 8 W ∴ ∼Ua {from 7}
· * 9 W ∴ (∃x)Ux {from 1}
· 10 W ∴ Ub {from 9}
· Endless loop: add “∼Ub” to the actual world to make the conclusion false.
a, b
· 10. Valid
· 15. Valid (but line 11 requires S5 or B).
Chapter 12 answers
12.1a
· 1. (L ∨ S)
· 3. (A ⊃ W) or, equivalently, (∼W ⊃ ∼A)
· 5. ∼(A • B)
· 10. ((x)Ax ⊃ Au)
· 15. (B ⊃ ∼A)
· 20. (∃x)(Sx • Wx)
12.2a
· 1. Valid
· 3. Invalid
· 1 (A ⊃ B)
· [ ∴ (∼B ⊃ ∼A)
· * 2 asm: ∼(∼B ⊃ ∼A)
· 3 ∴ ∼B {from 2}
· 4 ∴ A {from 2}
· 5 asm: ∼A {break 1}
∼B, A, ∼A
· 5. Valid
· 10. Valid
12.2b
· 1. Valid
· 3. Valid
· 5. Valid
· 10. Valid
· 15. Invalid
· 1 (T ⊃ M)
· 2 T
· [ ∴ M
· 3 asm: ∼M
· 4 asm: ∼T {break 1}
T, ∼M, ∼T
· 20. Invalid
· 1 (x)(Hx ⊃ Ex)
· [ ∴ (x)(∼Ex ⊃ ∼Hx)
· * 2 asm: ∼(x)(∼Ex ⊃ ∼Hx)
· * 3 ∴ (∃x)∼(∼Ex ⊃ ∼Hx) {from 2}
· * 4 ∴ ∼(∼Ea ⊃ ∼Ha) {from 3}
· 5 ∴ ∼Ea {from 4}
· 6 ∴ Ha {from 4}
· 7 ∴ (Ha ⊃ Ea) {from 1}
· 8 asm: ∼Ha {break 7}
a
∼Ea, Ha, ∼Ha
12.3a
· 1. (A ⊃ O∼B)
· 3. (O∼A ⊃ ∼A)
· 5. ☐(A ⊃ RA)
· 10. O∼(B • ∼A)
· 15. (∼◇(x)Ax ⊃ O∼Au)
· 20. R(x)(∼Tx ⊃ Sx)
12.4a
· 1. Valid
· 3. Valid
· 5. Invalid
· [ ∴ O(A ⊃ OA)
· * 1 asm: ∼O(A ⊃ OA)
· * 2 ∴ R∼(A ⊃ OA) {from 1}
· * 3 D ∴ ∼(A ⊃ OA) {from 2}
· 4 D ∴ A {from 3}
· * 5 D ∴ ∼OA {from 3}
· * 6 D ∴ R∼A {from 5}
· 7 DD ∴ ∼A {from 6}
· 10. Valid
· 15. Valid
· 20. Invalid
· 1 O(x)(Fx ⊃ Gx)
· 2 OFa
· [ ∴ OGa
· * 3 asm: ∼OGa
· * 4 ∴ R∼Ga {from 3}
· 5 D ∴ ∼Ga {from 4}
· 6 D ∴ (x)(Fx ⊃ Gx) {from 1}
· 7 D ∴ Fa {from 2}
· * 8 D ∴ (Fa ⊃ Ga) {from 6}
· 9 D ∴ ∼Fa {from 5 and 8}
· 10 ∴ ∼Fa {from 9 by indicative transfer}
· 25. Valid
12.4b
· 1. Valid
· 3. Valid
· 5. Invalid
· [ ∴ (OA ⊃ A)
· * 1 asm: ∼(OA ⊃ A)
· 2 ∴ OA {from 1}
· 3 ∴ ∼A {from 1}
· 4 ∴ A {from 2}
· 10. Invalid
· * 1 R(∃x)Ax
· [ ∴ (x)RAx
· * 2 asm: ∼(x)RAx
· * 3 D ∴ (∃x)Ax {from 1}
· * 4 ∴ (∃x)∼RAx {from 2}
· 5 D ∴ Aa {from 3}
· * 6 ∴ ∼RAb {from 4}
· 7 ∴ O∼Ab {from 6}
· 8 D ∴ ∼Ab {from 7}
· 15. Valid
· 20. Valid
· 25. Valid
Chapter 13 answers
13.1a
· 1. u:∼G
· 3. ∼u:G
· 5. ☐(u:G ⊃ ∼u:∼G)
· 10. ∼(u:A • u:∼A)
13.2a
· 1. Valid
· 3. Invalid
· * 1 ∼◇(A • B)
· [ ∴ (u:A ⊃ ∼u:B)
· * 2 asm: ∼(u:A ⊃ ∼u:B)
· 3 ∴ ☐∼(A • B) {from 1}
· 4 ∴ u:A {from 2}
· 5 ∴ u:B {from 2}
· 6 u ∴ B {from 5}
· * 7 u ∴ ∼(A • B) {from 3}
· 8 u ∴ ∼A {from 6 and 7}
· 5. Invalid
· * 1 ∼◇(A • B)
· [ ∴ (u:∼A ∨ u:∼B)
· * 2 asm: ∼(u:∼A ∨ u:∼B)
· 3 ∴ ☐∼(A • B) {from 1}
· * 4 ∴ ∼u:∼A {from 2}
· * 5 ∴ ∼u:∼B {from 2}
· 6 u ∴ A {from 4}
· 7 uu ∴ B {from 5}
· * 8 u ∴ ∼(A • B) {from 3}
· 9 u ∴ ∼B {from 6 and 8}
· * 10 uu ∴ ∼(A • B) {from 3}
· 11 uu ∴ ∼A {from 7 and 10}
· 10. Valid
13.2b
· 1. Valid
· 3. Invalid
· 1 u:A
· [ ∴ ∼u:∼A
· 2 asm: u:∼A
· 3 u ∴ ∼A {from 2}
· 5. Invalid
· [ ∴ (u:A ∨ u:∼A)
· * 1 asm: ∼(u:A ∨ u:∼A)
· * 2 ∴ ∼u:A {from 1}
· * 3 ∴ ∼u:∼A {from 1}
· 4 u ∴ ∼A {from 2}
· 5 uu ∴ A {from 3}
· 10. Invalid
· [ ∴ (A ⊃ u:A)
· * 1 asm: ∼(A ⊃ u:A)
· 2 ∴ A {from 1}
· * 3 ∴ ∼u:A {from 1}
· 4 u ∴ ∼A {from 3}
13.3a
· 1. u:Sa
· 3. u:OSa
· 5. u:Sa
· 10. (u:OAu ⊃ Au)
· 15. (u:Axu ⊃ Aux)
13.4a
· 1. Valid
· 3. Invalid
· [ ∴ (u:Ba ∨ u:∼Ba)
· * 1 asm: ∼(u:Ba ∨ u:∼Ba)
· * 2 ∴ ∼u:Ba {from 1}
· * 3 ∴ ∼u:∼Ba {from 1}
· 4 u ∴ ∼Ba {from 2}
· 5 uu ∴ Ba {from 3}
· 5. Invalid
· 1 u:(x)OAx
· [ ∴ u:Au
· * 2 asm: ∼u:Au
· 3 u ∴ ∼Au {from 2}
· 10. Valid
13.4b
· 1. Valid
· 3. Valid
· 5. Valid
· 10. Invalid
· [ ∴ (u:Au ⊃ u:RAu)
· * 1 asm: ∼(u:Au ⊃ u:RAu)
· 2 ∴ u:Au {from 1}
· * 3 ∴ ∼u:RAu {from 1}
· * 4 u ∴ ∼RAu {from 3}
· 5 u ∴ O∼Au {from 4}
13.5a
· 1. Ou:Sa
· 3. Ru:OSa
· 5. (x)∼Rx:G
· 10. (Ou:x=x • (x=x • u:x=x))
· 15. (Ou:A ≡ ∼◇(u:A • ∼A))
· 20. (u:Axu ⊃ OAux)
· 25. ((∼Du • Ou:Bj) ⊃ Ou:Fj)
13.6a
· 1. Valid
· 3. Valid
· 5. Invalid
· 1 Oa:(C • D)
· [ ∴ Ob:C
· * 2 asm: ∼Ob:C
· * 3 ∴ R∼b:C {from 2}
· * 4 D ∴ ∼b:C {from 3}
· 5 Db ∴ ∼C {from 4}
· 6 D ∴ a:(C • D) {from 1}
· * 7 Da ∴ (C • D) {from 6}
· 8 Da ∴ C {from 7}
· 9 Da ∴ D {from 7}
· 10. Valid
13.6b
· 1. Valid
· 3. Valid
· 5. Valid
· 10. Valid
· 15. Valid
· 20. Invalid
· 1 O∼(u:A • ∼u:B)
· [ ∴ (u:A ⊃ u:B)
· * 2 asm: ∼(u:A ⊃ u:B)
· 3 ∴ u:A {from 2}
· 4 ∴ ∼u:B {from 2}
· 5 ∴ ∼(u:A • ∼u:B) {from 1}
· ** 6 asm: ∼u:A {break 5}
· 7 u ∴ ∼A {from 6}
· 25. Valid
· 30. Invalid
· 1 (x)Rx:A
· [ ∴ R(x)x:A
· * 2 asm: ∼R(x)x:A
· 3 ∴ O∼(x)x:A {from 2}
· * 4 ∴ Ra:A {from 1}
· 5 D ∴ a:A {from 4}
· * 6 D ∴ ∼(x)x:A {from 3}
· * 7 D ∴ (∃x)∼x:A {from 6}
· * 8 D ∴ ∼b:A {from 7}
· 9 Db ∴ ∼A {from 8}
· * 10 ∴ Rb:A {from 1}
· 11 DD ∴ b:A {from 10}
· Endless loop: add “∼a:A” to world DD to make the conclusion false. (You weren’t required to give a refutation.)
a, b
Chapter 14 answers
14.6
Impartiality formula – valid. (See footnote at the end of Chapter 14.)
Formula of universal law – valid. (See footnote at the end of Chapter 14.)