Logic of relations
The logic of relations is a philosophical framework that examines how two or more propositions (statements or concepts) interact and affect one another's truth. It covers several types of relationships, including contradictory, contrary, transitive, reflexive, modal, and conditional relations. For instance, contradictory propositions assert opposing truths, while transitive reasoning links three propositions so that if the first two are true, the third must also be true. The concept traces back to mathematician Augustus De Morgan, who sought to address the limitations of earlier logical systems. Charles Sanders Peirce developed these ideas further, bringing algebraic methods into logical analysis and opening deeper insights into relational reasoning. This area of study plays a critical role in philosophy and also intersects with mathematics and computer science, where its principles of reasoning and argumentation are widely applied. Understanding the logic of relations can sharpen critical thinking and bring clarity to judging the validity of statements and their interconnections.
The logic of relations is a way of considering how two or more statements or concepts, often called propositions, bear on one another when they are set side by side. For example, if a person says, "I always tell the truth," and then says, "I'm lying," a logician would say these two statements are related because for one to be true the other must be false. Philosophers and logicians use the logic of relations when examining whether there is a connection between two propositions and how that connection affects the truth of each statement.
![Diagram of an example database according to the relational model (US Department of Transportation, via Wikimedia Commons)](https://imageserver.ebscohost.com/img/embimages/ers/sp/embedded/89144275-114879.jpg)
![Relational model concepts (via Wikimedia Commons)](https://imageserver.ebscohost.com/img/embimages/ers/sp/embedded/89144275-114880.jpg)
Background
The concept of logic of relations has its origins in mathematics. It is generally credited to mathematician and logician Augustus De Morgan, an India-born Englishman. De Morgan originated a number of important mathematical terms, theories, and systems, but these are not considered as significant as his contribution to logic.
His innovations were laid out in a book titled Formal Logic: or, The Calculus of Inference, Necessary and Probable (1847). In it, he addressed a problem with the older logical concepts that had been in use since the days of Aristotle. De Morgan proposed a way to deal with propositions that were not easily handled under more absolute forms of logic. For instance, suppose one is analyzing statements about dogs in a dog park, where one proposition is that most of the dogs are brown and another is that most of the dogs are running. A further statement can then be made that some dogs are both brown and running. The logical analysis shows that there is a relationship between the statements.
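As a rough arithmetic sketch of why De Morgan's conclusion follows (the counts below are invented for illustration), if more than half of the dogs are brown and more than half are running, the two groups must overlap:

```python
# Hypothetical illustration of De Morgan's reasoning about "most":
# if most dogs are brown and most dogs are running, the two groups
# must overlap, so some dogs are both brown and running.
total_dogs = 10
brown = 6      # "most" means more than half, i.e. more than 5
running = 7    # also more than half

# Even in the worst case, the overlap is at least this large:
minimum_overlap = brown + running - total_dogs
print(minimum_overlap)      # 3 -> at least three dogs are both brown and running
assert minimum_overlap > 0  # guaranteed whenever both counts exceed half
```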
The concept of the logic of relations is strongly associated with logician Charles Sanders Peirce. The American-born Peirce grew up in the household of a Harvard mathematician father who frequently hosted highly educated guests well versed in mathematics, science, politics, and writing. As a result, Peirce developed a strong and curious intellect, though he was not a disciplined student in the classroom. After graduation, he held several jobs, including a long-standing position with the US Coast Survey, which allowed him to indulge his passion for logic.
In 1870, Peirce presented a paper titled "Description of a Notation for the Logic of Relatives, Resulting from an Amplification of the Conceptions of Boole's Calculus of Logic." The paper expanded on De Morgan's concepts, which Peirce thought were incomplete. Peirce went on to apply algebraic rules and principles to solve logic problems. This allowed the use of multiple quantifiers, words that indicate the extent of the subject of a statement (in the earlier example where most dogs are brown and most are running, "most" is a quantifier).
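As a loose modern analogy (not Peirce's own notation), combining quantifiers can be pictured with Python's built-in all and any, which play the roles of "every" and "some" over hypothetical data:

```python
# "Every dog has some color" pairs a universal quantifier (every)
# with an existential one (some). The data below is invented.
dogs = {"Rex": {"brown"}, "Fido": {"black", "white"}}

every_dog_has_a_color = all(len(colors) > 0 for colors in dogs.values())
some_dog_is_brown = any("brown" in colors for colors in dogs.values())

print(every_dog_has_a_color, some_dog_is_brown)  # True True
```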
Peirce had a number of students who continued his work. These included John Dewey, Christine Ladd-Franklin, Allan Marquand, and Oscar Howard Mitchell. They made their own contributions to logic, mathematics, and even computer science.
Overview
Logic is a part of philosophy that investigates questions of reasoning. Logicians begin with one set of statements or concepts, often called propositions, and work systematically to a conclusion about these statements. One way they do this is by looking for patterns of relationship between the propositions, such as the ways they are alike or different, and how these relationships lead to the conclusion about the statements.
There are a number of ways for two propositions to be related in logic problems. These include contradictory, contrary, transitive, and reflexive relations; propositions can also stand in modal or conditional relations.
Contradictory propositions are a pair of statements that cannot both be true and cannot both be false at the same time. The statements use different quantifiers and are phrased so that one is affirmative and the other negative; exactly one of them will be true and the other false. For example, "Every apple is a fruit; some apples are not fruit."
Contrary propositions deal with the same subject but make conflicting statements about it. An example would be, "All apples are fruit; no apples are fruit." In this case, both statements can be false or one can be false and one true, but both cannot be true.
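A brief sketch, using an invented basket of items, can show how contradictory and contrary propositions behave (traditional logic assumes the subject class is not empty):

```python
# Hypothetical, non-empty collection of items for illustration.
basket = [("apple", "fruit"), ("apple", "fruit"), ("apple", "wax replica")]

apples = [kind for name, kind in basket if name == "apple"]

every_apple_is_fruit = all(kind == "fruit" for kind in apples)     # universal affirmative
some_apple_is_not_fruit = any(kind != "fruit" for kind in apples)  # particular negative
no_apple_is_fruit = all(kind != "fruit" for kind in apples)        # universal negative

# Contradictories always take opposite truth values.
assert every_apple_is_fruit != some_apple_is_not_fruit

# Contraries can never both be true, though both may be false, as here.
assert not (every_apple_is_fruit and no_apple_is_fruit)
print(every_apple_is_fruit, some_apple_is_not_fruit, no_apple_is_fruit)  # False True False
```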
Transitive logic statements involve three propositions. They are related so that whatever relation holds between the first and second statements and between the second and third statements must also hold between the first and third statements. For instance, if statement one is, "All apples are fruits," and statement two is, "All fruits are juicy," and both are deemed true, then the third statement, "All apples are juicy," must also be true.
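A minimal sketch of this chain, with invented categories, treats each "All A are B" statement as a subset relation:

```python
apples = {"gala", "fuji"}
fruits = {"gala", "fuji", "banana"}
juicy_things = {"gala", "fuji", "banana", "orange"}

# "All apples are fruits" and "All fruits are juicy":
assert apples <= fruits          # <= tests the subset relation
assert fruits <= juicy_things

# Transitivity: the relation carries over from the first term to the third,
# so "All apples are juicy" must also hold.
print("All apples are juicy:", apples <= juicy_things)  # True
```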
Reflexive statements are propositions that refer to themselves, such as "I always lie." Their self-referential nature creates interesting logical problems: if a person says, "I always lie," the logician is left to work out whether someone who claims to always lie is telling the truth about lying.
Modal statements use qualifiers such as "necessarily" or "possibly," or words that express these notions of necessity and possibility. For example, "You may fail the test if you do not study" is a modal statement; "You will fail the test if you do not study" is the same statement phrased for classic or traditional logic. Modalities allow logicians to reason about situations that involve uncertainties and possibilities.
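One rough way to picture the difference, borrowed loosely from possible-worlds treatments of modal logic and using invented scenarios, is to read "possibly" as "in at least one scenario" and "necessarily" as "in every scenario":

```python
# Hypothetical scenarios describing whether a student studied and failed.
scenarios = [
    {"studied": False, "failed": True},
    {"studied": False, "failed": False},  # skipped studying but got lucky
    {"studied": True, "failed": False},
]

did_not_study = [w for w in scenarios if not w["studied"]]

possibly_fail = any(w["failed"] for w in did_not_study)     # "You may fail if you do not study"
necessarily_fail = all(w["failed"] for w in did_not_study)  # "You will fail if you do not study"

print(possibly_fail, necessarily_fail)  # True False
```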
A conditional relation, sometimes called a logical implication, is a relationship in which, if one part of the statement is true, then the other part must be true as well. For example, "People who study pass tests. I studied, so I am going to pass." In these statements, the truth of the first part implies the truth of the second; equally, if the second part is false, the first part must be false as well.
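A small truth-table sketch, reading the conditional as material implication (a common though simplified interpretation), shows when "if you study, you pass" holds:

```python
def implies(p, q):
    # "p implies q" is false only when p is true and q is false.
    return (not p) or q

for studied in (True, False):
    for passed in (True, False):
        print(f"studied={studied!s:5} passed={passed!s:5} "
              f"'study implies pass' holds: {implies(studied, passed)}")

# If the implication holds and "studied" is true, "passed" must be true;
# and if "passed" is false, "studied" must have been false as well.
```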