Philosophical Origins of Computers (Continued)

Historical Motivations for Symbolic Logic and Set Theory

The general purpose of this site has been to introduce conceptual or historical theories, systems, or disciplines at a preliminary level to a wider audience.  We try to assume no background in these areas and attempt, in some imperfect way, to relate them to some practical level of interest.  However, I feel some subjects are really worth getting into in more detail, and I hope that those who view this site will be understanding of that occasional indulgence.  Set theory is one of my favorite things, something I feel a genuine passion for, so I would like to take a little more time not only to discuss its historical development and practical motivation but to really try to introduce some of its structure and notation along the way (without Trin or Tom yelling at me; they haven’t posted in a while so I’m trying to get away with it).  I will try to break these posts up between more accessible and more detailed for readers with varying levels of interest.

Symbolic Logic: The Limitations of Syllogistic Reasoning

So what is set theory and why should anyone care?  What is its practical motivation, including its ultimate relationship to computers, as indicated in the previous themed post?  And, for that matter, what is the connection to Boolean algebra?


Very briefly, we begin with Aristotle (384 BCE – 322 BCE) in ancient Greece, who first systematized logic in a formal way.  In his Prior and Posterior Analytics, Aristotle outlines logic as an organization of proper rules for reasoning and for making justified decisions.  It is in this context that Aristotle introduces syllogistic reasoning, built upon two premises from which one conclusion is derived.  These premises use quantifying words, that is, words such as “all”, “some”, or “none” that bind the meaning of the rest of the argument (the premise) to the meaning of the quantifier, and so articulate a clear description of the nature of the argument.  Thus, if

Premise 1: “All Zonks are Zucks” and,
Premise 2: “All Zucks are Zarks”, then it necessarily follows,
Conclusion: “All Zonks are Zarks”

Through syllogistic reasoning we see that the form of the argument is transitive; that is, no matter the meanings, or lack of meanings, of the words in the real world, the form of the argument applies universally to any values placed into the syllogism.  This was a revolutionary discovery and formed the basis of logical analysis for over 2,000 years!  More or less, ancient Greece’s contributions of Euclidean geometry and Aristotelian logic were not fundamentally reformed until the 19th century!
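To make the transitivity of the form concrete, the syllogism can be sketched as set inclusion, where “All X are Y” means X is a subset of Y.  The particular sets and member names below are purely illustrative:

```python
# Model each category as a set of individuals; "All X are Y" becomes X <= Y.
zonks = {"z1", "z2"}
zucks = {"z1", "z2", "u1"}
zarks = {"z1", "z2", "u1", "a1"}

premise_1 = zonks <= zucks    # "All Zonks are Zucks"
premise_2 = zucks <= zarks    # "All Zucks are Zarks"
conclusion = zonks <= zarks   # "All Zonks are Zarks"

# Subset inclusion is transitive, so the conclusion follows from the premises
# regardless of what the nonsense words actually mean.
assert premise_1 and premise_2 and conclusion
```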

Hilariously Invalid Syllogism

And guess what?  Just as surprising, the motivation for their dramatic structural expansion was the same in both cases.  So, a quick tangent: what is non-Euclidean geometry?  And for that matter, what is Euclidean geometry?  Well, I am really glad you asked me.  Euclidean geometry was founded by the ancient Greek mathematician Euclid (3rd century BCE, exact dates unknown), who wrote the Elements, the work he is most famous for, though he wrote many other things too.  This work, in particular, has been the foundation of geometry for much of civilization, and it systematized a logical, proof-oriented approach to mathematics that served as the basis of future formalization.  The primary difference between Euclidean and non-Euclidean geometry is the curvature of space.  The lines and spatial objects of Euclidean geometry are envisioned against a flat background, insofar as the surrounding space is assumed to have no structure beyond the length, width, and depth of the objects themselves.  Non-Euclidean geometry, such as Riemannian geometry, allows the background space itself to be curved.  As you may have guessed already, non-Euclidean geometry was a necessary condition for the mathematical language that generated Einstein’s theory of relativity.  Since, on that theory, spacetime is curved, a Newtonian mechanical system built upon a flat Euclidean geometric field was an inaccurate mathematical description of the physics.

At the same time as mathematics was being revolutionized in the 19th century, logic was undergoing major changes thanks to Boole, De Morgan, Peirce, and a generation of logicians who came to realize the significant shortcomings of what Aristotelian logic was capable of validly modeling.  This had been hinted at in previous centuries, for example in Immanuel Kant’s desire to generate a transcendental logic (which he never did) and in G.W.F. Hegel’s Science of Logic, which explicated a series of dialectical relations that attempted to unify nature in the idea of the Absolute.  But in the end it was not the idealists but the empirical realists who formalized logic.  At the time, a primary motivating factor was geometry.  How can one logically describe the mathematics of the extension of a body in an Aristotelian syllogism?  Exactly.  Thus, if logic comprises the laws of thought, and if mathematics cannot be grounded in logical principles, then there must be a shortcoming in the field of logic.

Symbolic logic is a collection of symbols and functions that formalize valid inference (argument).  The other purpose of logic is axiomatization, that is, to create a model by which to represent the nature and structure of some type of phenomena.  These two purposes are related but often have different emphases in their application.  For example, while quantified modal logic is built upon a foundation of valid inference, its primary motivation is to model aspects of world description as bound to the conceptual notions of ‘possibility’ and ‘necessity’.  On the other hand, classical first-order logic is broad enough to serve as the basic foundation of how valid inference works, but in this universality it is generally not, by itself, a good model for particular applications.  For example, undergraduate philosophers are taught that symbolic logic (the propositional calculus) is a model for natural language (English, German, Farsi, etc.), but this is only partially true, insofar as the grammatical structure of a language can be modeled in terms of valid or invalid inference.  However, the meaning of words, the morphology of language, and the underlying holistic conceptual patterns of language are far beyond the ability of first-order logic to model.  Indeed, first-order logic cannot even handle in its axiomatization a sentence built on an indexical such as “I”, as in “I like to go to the park”, whose referent shifts with the speaker.  When doing computation, this is an important difference to remember between the computation or logic of the code of grammar versus the model structure used to represent the semantic patterns of language.  They are related, but many paradoxes arise if a weak logic, incapable of handling such complexity, is used to model natural language.
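As a sketch of what “formalizing valid inference” means in the propositional calculus: an argument is valid just in case no assignment of truth values makes every premise true while the conclusion is false, and for finitely many variables this can be checked mechanically.  The helper name below is my own, purely for illustration:

```python
from itertools import product

def is_valid(premises, conclusion, variables):
    """An argument is valid iff no truth assignment makes all
    premises true while making the conclusion false."""
    for values in product([True, False], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(p(env) for p in premises) and not conclusion(env):
            return False
    return True

# Modus ponens: P, P -> Q, therefore Q.  (P -> Q is "not P or Q".)
premises = [lambda e: e["P"], lambda e: (not e["P"]) or e["Q"]]
conclusion = lambda e: e["Q"]
print(is_valid(premises, conclusion, ["P", "Q"]))   # True

# Affirming the consequent: Q, P -> Q, therefore P.  Invalid:
# P = False, Q = True makes the premises true but the conclusion false.
bad_premises = [lambda e: e["Q"], lambda e: (not e["P"]) or e["Q"]]
bad_conclusion = lambda e: e["P"]
print(is_valid(bad_premises, bad_conclusion, ["P", "Q"]))   # False
```

Note that the checker knows nothing about what P and Q mean; as with the syllogism, only the form of the argument matters.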

Set Theory

Now, set theory originally arose out of a logic for structuring quantity, namely number theory.  You may recall from mathematics courses the set of natural numbers N (0, 1, 2, 3, …) or the set of real numbers R (which includes all the rationals and irrationals), et cetera.  The analysis of these sets helped give rise to the distinction between countably infinite and uncountably infinite sets of elements.  At any rate, a set is a collection of elements.  Although set theory was originally established for the study of the theory of number, it has been applied to all manner of inquiries, essentially anything that has elements, from words, to concepts, to sets of functions.  It is an incredibly versatile and fundamental system of analytical tools with many intuitive and counter-intuitive notions.

Set theory arose in the 19th century and underwent major refinement in the 20th century, which gave rise to axiomatic set theory.  We could spend a few semesters discussing this topic.  Suffice it to say that several mathematicians generated set theory in the 1870s and then discovered a number of problems in its application, namely paradoxes such as “the set of all sets” (is that set inside or outside the set of all sets? If outside, it is a higher set, which in turn requires another set, ad infinitum).  To get around these issues, a system was developed slowly over the early 20th century called Zermelo-Fraenkel axiomatic set theory (ZF), later refined into Zermelo-Fraenkel set theory with the axiom of choice (ZFC), which adds the axiom of choice to its finite list of axioms.  The point of these investigations was to provide the logical foundation for the field of mathematics, and they are recognized today as providing that; the project was quite controversial in the early 20th century but no longer is.

It was in this spirit of the times that the mathematician and philosopher Alfred North Whitehead (1861-1947) and Lord Bertrand Russell (1872-1970) constructed a complex system of logical proofs for the sake of grounding mathematical operations in the Principia Mathematica.  It is nearly impossible to overstate the historical significance of this book, despite the fact that few contemporary philosophers seem to have ever read it.  The notation is outdated by today’s standards, but it is a touchstone in the development of English-speaking philosophy.  The book inspired figures from Kurt Gödel and Alonzo Church to W.V.O. Quine (Whitehead was his dissertation advisor).


Filed under Academia, Analytic, Bertrand Russell, History, Intellectual, Logic, Mathematics, Philosophy, Uncategorized

Banach-Tarski Paradox

I am going to take a quick respite before I continue and just briefly mention, since I alluded to it in the post on George Boole, the mathematical theory that underpins all work on probability: measure theory.  I absolutely love mathematicalmonk’s YouTube videos.  He has over 120 of them, on everything from information entropy to cumulative distribution functions to Markov chains.  He introduces a long series of videos with the Banach-Tarski paradox and, while not going through the proofs, offers a great justification for why we need measurable sets in topology.  I’d like to share this video and see what other people think of it.  It also keeps with the theme of this series of posts, since Alfred Tarski (1901-1983) was a Polish philosopher and mathematician.  He wrote extensively on the semantic conception of truth and formal semantics, important issues in the philosophy of language.  His most successful student was Richard Montague (1930-1971), who developed Montague grammar.  Montague was murdered by asphyxiation in California, and I believe a play was recently released on his life leading up to that; the case remains unsolved.  Anyway, his referential work on meaning and grammar is pretty interesting!  So without further ado…



Filed under Analytic, Logic, Mathematics, Philosophy, Probability & Statistics, Uncategorized

Philosophical Origins of Computers

As indicated by my last post, I’d really like to tie philosophical contributions to mathematics to the rise of the computer.  I’d like to jump from Leibniz to Boole, since it was Boole who got the ball rolling toward finally bringing to fruition what Leibniz had first speculated was possible.

In graduate school, I came across a series of lectures by a former head of the entire research and development division of IBM, which covered, in a surprising level of detail, the philosophical origins of the computer industry.  To be honest, it’s the sort of subject that really should be book-length.  But I think it is a great contemporary example of exactly what philosophy is supposed to be: discovering new methods of analysis that, as they develop, are spun out of philosophy and born as new independent (or semi-independent) fields.  Theoretical linguistics is a good example, particularly linguistic pragmatics, which has only been an independent field since the late 1960s, when the philosopher H.P. Grice founded it with his work on conversational implicatures.  In truth, Grice is studied more in linguistics today than he is in philosophy.  But really, this is the sign of a real legacy, and it serves as a good example of the practical value of philosophy at its best.  And while political science, sociology, psychology, physics, and many other fields were founded by philosophers, I feel that for our generation the philosophical origination of the computer age, and of the field of computer science, speaks to a particularly valuable practical level for all of us, and so I feel it’s a story worth telling.  It is also one reason why I feel, from this history, that future generations of philosophers have a very important role to play in artificial cognition and its developments.

Well, that’s all tangential; the upshot is that both philosophers and non-philosophers could learn something from examples of the theoretical nature of philosophical investigation deriving new disciplines of empirical inquiry.  We really should begin with the British philosopher and mathematician George Boole (1815-1864).

George Boole

In 1854, Boole published an extraordinary book, which I read after graduate school out of interest and which is quite worth a look: An Investigation of the Laws of Thought on Which are Founded the Mathematical Theories of Logic and Probabilities.  It’s been a few years since I read it, but if I recall, it has three main components: (1) a significant introduction outlining an empirical theory of the mind based upon the relationship of cognitive structures and thoughts, with the inferential reasoning of logic as its foundation; (2) an algebra of logic detailing operators, including for syllogisms and for predicates; and (3) an investigation of probability, logic, and the mind’s relation to the world.  Boole was firmly an empiricist philosopher, like almost every British philosopher, and is credited as the father of the algebra of logic.  One note about the book, however: modern Boolean algebra is not found in it, because the book does not outline procedures for the fundamental concepts of union, intersection, and complement that define Boolean algebra today.

However, what is noteworthy is tracing binary code back to Boole, insofar as the valuation of his logic is built upon the truth (1) or falsity (0) of what is logically related.  This was further developed in the 1854 book, but the primary outline was established in the 1847 work “The Mathematical Analysis of Logic.”  Here Boole creates an algebra of logic built upon the three operators AND, OR, and NOT.  These operators were later extended by others into modern Boolean algebra, where the real value of these thoughts came together with contributions by the economist William Stanley Jevons and the philosopher and founder of American pragmatism Charles Sanders Peirce.  Peirce, in particular, did considerable work in the logic of relations, including the union, intersection, and complement of the algebra of sets.  Here union corresponds to OR, intersection to AND, and complement to NOT, and it is in this context that De Morgan’s laws take their modern algebraic form.  The upshot for our purposes, however, is that these operators and their true/false values are the structure upon which all binary code is built.
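A small sketch of these operators at work.  It exhaustively checks De Morgan’s laws over the truth values, and then over sets, showing the correspondence of OR to union, AND to intersection, and NOT to complement (relative to a universe):

```python
from itertools import product

# Exhaustively verify De Morgan's laws over the two truth values.
for a, b in product([False, True], repeat=2):
    assert (not (a and b)) == ((not a) or (not b))   # NOT(A AND B) = NOT A OR NOT B
    assert (not (a or b)) == ((not a) and (not b))   # NOT(A OR B)  = NOT A AND NOT B

# The same laws hold for sets: the complement of a union is the
# intersection of the complements, and vice versa.
universe = {1, 2, 3, 4}
A, B = {1, 2}, {2, 3}
assert universe - (A | B) == (universe - A) & (universe - B)
assert universe - (A & B) == (universe - A) | (universe - B)
```

Since every check is over all cases, the two-valued laws are genuinely proved by this enumeration, which is exactly why two-valued logic lends itself to mechanical treatment.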

From a theoretical level, and from the standpoint of probability, all probability theory is founded upon a field of mathematics known as measure theory.  Measure theory is built upon a sigma-algebra, and a probability measure is a special kind of measure whose range of possible values is limited to the closed interval [0, 1].  However, the way in which probability measures are constructed relies fundamentally on the relationships of events.  That is to say, the Boolean algebra of events models, in the case of a conditional probability, the chance that event A will occur given that event B occurred.  We can represent the relationships of these events in Venn diagrams, which show, for instance, the intersection of the two.  This is an outgrowth of the original work of George Boole by some very important mathematicians.  It would be a very interesting topic to discuss on its own.
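As a toy illustration of this on a finite probability space (a fair die, every outcome equally likely), the conditional probability P(A given B) = P(A and B) / P(B) reduces to counting outcomes.  The events chosen here are just for illustration:

```python
from fractions import Fraction

omega = set(range(1, 7))   # sample space: one roll of a fair die
A = {2, 4, 6}              # event: the roll is even
B = {4, 5, 6}              # event: the roll is at least 4

def prob(event):
    # Uniform probability measure: count outcomes in the event.
    return Fraction(len(event & omega), len(omega))

# P(A | B) = P(A ∩ B) / P(B)
p_a_given_b = prob(A & B) / prob(B)
print(p_a_given_b)   # 2/3: two of the three outcomes in B are even
```

The events form a Boolean algebra under union, intersection, and complement, which is the connection back to Boole.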

But why is this important and what is the relationship to computers?  I would like to explore that in the next post in discussing the road to symbolic logic, which then led to the general theory of computation.  I would then like to double back to Boole and discuss the origins of information theory and tie the two together.


Filed under Academia, Analytic, Boole, History, Intellectual, Logic, Mathematics, Probability & Statistics, Uncategorized

Philosophical Origins of Mathematics (Continued)

Gottfried Leibniz (1646 – 1716) is often said to be the last human being who knew everything.  While he remains controversial in philosophy for his rationalist epistemology, there is no denying that he was one of the greatest geniuses, if not the greatest, in human history.  While a considerable body of Leibniz’s work remains to be published, the range of his contributions to the world runs the gamut from war plans to invade Egypt, which generations later no less than Napoleon Bonaparte employed, to generating the proofs for calculus.  While it is true that the natural philosopher Isaac Newton discovered much of the same mathematical analysis that Leibniz did, Newton never generated the level of proof that Leibniz did.  To this day we use much of Leibniz’s notation for calculus (dy/dx and dt, for example).

Gottfried Leibniz, Philosopher and Mathematician

Leibniz originally conceived of the mathematical limit in calculus as a sequence of rectangles descending to infinitely small size.  Nonetheless, he was mainly concerned with calculus as a relationship of finite sums and differences in geometric space.  Of course, the limit is the most fundamental concept of calculus.  Calculus underwent a number of upgrades after its origins, particularly thanks to the German mathematician Weierstrass, a name that would be familiar to philosophers in mathematical logic.  Undoubtedly, the most important contribution Leibniz gave us was the systematization of, and the proofs establishing, the relationship between integral and differential functions.  This is now known as the fundamental theorem of calculus, though there are technically two primary parts of it.


The idea is simply that by taking all the changes of something over time (the differential), the summation of those changes will add up to the total change (the integral) of that thing.  Thus, from the nature of a limit over a continuum (as opposed to something discrete), the derivative of a differentiable function is necessarily conjoined with its antiderivative, which is simply its integral.
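The idea above can be illustrated numerically: summing the derivative’s small contributions f′(x)·dx across an interval recovers the total change f(b) − f(a).  A minimal sketch with f(x) = x², whose derivative is 2x, so the total change over [0, 3] should be 9:

```python
# Numerically illustrate the fundamental theorem of calculus:
# summing small changes f'(x) * dx recovers the total change f(b) - f(a).
n = 100_000
a, b = 0.0, 3.0
dx = (b - a) / n

total = 0.0
for i in range(n):
    x = a + (i + 0.5) * dx   # midpoint of each small subinterval
    total += 2 * x * dx      # f'(x) * dx for f(x) = x**2

print(round(total, 6))   # 9.0, which equals f(3) - f(0) = 9 - 0
```

The finer the subdivision (larger n), the closer the sum gets to the exact change, which is precisely the limit idea at the heart of the theorem.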


A major catalyst for Leibniz’s interest in these considerations stemmed from the concept of infinity.  Leibniz dedicated hundreds of pages to infinity.  Indeed, Yale, I believe, released a book on it, which is over on my shelf.  Let me see, just a second; ok, I’m back.  Ah yes, “The Labyrinth of the Continuum: Writings on the Continuum Problem, 1672-1686.”  One can see in this colossal book the relationship, in Leibniz’s deep, penetrating thought, between mathematics and metaphysics.  It covers everything from unbounded extensions to motion and the transformation of change.  Leibniz argues that space and motion are ultimately forms of relations, which was a motivating factor in his pursuit of differential calculus.  Similarly, Leibniz covers very different subjects concerning the relation of souls and minds, and the infinite degrees of “souls”.  One may be surprised to learn that in this context Leibniz applied certain analytical notions of integration.  Or at least that’s how I interpreted him.  I’ve never really talked about this with anyone before, so to be honest it’s just between you and me.

I must admit I really like Leibniz.  I know you’re not supposed to say that out loud to philosophers, but he was such an original thinker, and his extension of abstract analysis is essentially unparalleled by anyone within several centuries of him in either direction.  He was also hundreds of years ahead of his time in conceiving of the possibility of symbolic logic, and of computer language systems.  But this brings me to one of the things I most love about 19th and early 20th century philosophy, which is its principal contributions to the invention of the computer and the digital age.  While contemporary empirical philosophers found themselves at odds with much of Leibnizian philosophy, the underlying conception of mathematics as the primary model structure by which to analyze reality and knowledge was continued by George Boole, Bertrand Russell, Alonzo Church, Alan Turing, and many others.  It is in this sense that we particularly appreciate the unique work that Leibniz developed over his lifetime.


Filed under Academia, Analytic, History, Intellectual, Leibniz, Mathematics, Philosophy, Uncategorized

Philosophical Origins of Mathematics (Continued)

Continuing the remarks from the previous post, I felt it would be interesting to give a general historical overview of the creation and development of mathematics by famous philosophers.  Unfortunately, this is not something generally taught in undergraduate philosophy courses, which is a real shame.  In truth, it can be rather arduous getting young people to appreciate or respect modern philosophers without seeing contributions of theirs that have withstood the evolution of epistemology and metaphysics over the same span of time.  When the only thing you are taught as an introduction is that some philosopher generated an epistemologically skeptical theory whose proposed solution was inherently circular (Descartes), or that some other guy had an abstract general theory of windowless monads as the basis of his physical conception of reality (Leibniz), you may not realize how much, over your lifetime as a student, you have relied on these people for your mathematical education.

Now one can trace this relationship all the way back to ancient Greece.  Plato’s Meno is one dialogue that comes to mind.  Similarly, Aristotle devoted much time to deductive logic.  The relationship also plays a role in medieval thought, particularly in the confines of the metaphysical questions of realism and nominalism.  However, mathematics (and in my view philosophy as well) was rather primitive during these periods, so I would like to fast-forward to a point in time when the formalization of mathematical theory was really making progress.

Major Contributor of Analytic Geometry

Rene Descartes (1596-1650) published in 1637 his original work on the then-gestating method of conjoining algebra and geometry into a shared model of analysis.  Starting from the side of geometry, Descartes developed a variety of functions to represent his work on the Cartesian coordinate plane named after him.  If you have ever seen this before, you can thank the French philosopher.

The x-y coordinate Cartesian plane named after Rene Descartes.

Alongside the Cartesian plane is the Cartesian product, developed from the binary relations, or ordered pairs, of an (x,y) coordinate plane.  The Cartesian product of two sets is the set of all possible ordered pairs drawn from them.  If A is a set of numbers (or, more generally, elements) and B is as well, then their Cartesian product is A × B.  For example,

A = {1,2}; B= {3,4}
A × B = {1,2} × {3,4} = {(1,3), (1,4), (2,3), (2,4)}
B × A = {3,4} × {1,2} = {(3,1), (3,2), (4,1), (4,2)}
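The small computation above can be reproduced directly.  A sketch using Python’s itertools, which also shows that the product is ordered, so A × B and B × A differ:

```python
from itertools import product

A = {1, 2}
B = {3, 4}

# The Cartesian product: the set of all ordered pairs (a, b).
AxB = set(product(A, B))
BxA = set(product(B, A))

print(sorted(AxB))    # [(1, 3), (1, 4), (2, 3), (2, 4)]
print(AxB == BxA)     # False: the pairs are ordered, so A × B ≠ B × A
print(len(AxB))       # 4, which is |A| * |B|
```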

The Cartesian product is an analytical tool that can range from such simplicity to things truly complex, depending on the size and nature of the sets involved, from morphisms, to graphs, to functions.  Other simple properties of binary relations involve injective, surjective, or bijective mappings of the elements of sets.
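For finite sets, those mapping properties are easy to sketch; the helper names below are my own, purely for illustration:

```python
# A mapping between finite sets, represented as a dict.
def is_injective(mapping):
    """Injective (one-to-one): no two inputs share an output."""
    values = list(mapping.values())
    return len(values) == len(set(values))

def is_surjective(mapping, codomain):
    """Surjective (onto): every element of the codomain is hit."""
    return set(mapping.values()) == set(codomain)

f = {1: "a", 2: "b", 3: "a"}            # maps {1, 2, 3} into {"a", "b"}
print(is_injective(f))                   # False: 1 and 3 both map to "a"
print(is_surjective(f, {"a", "b"}))      # True: both "a" and "b" are hit
```

A mapping that is both injective and surjective is bijective, which pairs the two sets off element for element.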

Surjective Mapping

Descartes’ later work is now also considered to contain the first example of a theory of relative motion, preceding developments in the birth of modern physics.  Tangentially, Newton did not understand the censorship Descartes was writing under and incorrectly interpreted him as denying relative motion; Newton spent an embarrassing amount of time writing against Descartes under this assumption.  Descartes made many other mathematical contributions as well, in the formulation of polynomials in geometric terms and other related areas of analytic geometry.  But it is important to realize that the analysis that mathematics promises is of particular importance to the philosophical enterprise.  And as the conceptual space of analysis widens, so too does philosophy.

For those more interested in this subject, Descartes published a large and significant body of work called Discourse on Method that covers these topics in considerable depth; the analytic geometry appears in an appendix, La Géométrie.  The appendix is not what a contemporary mathematician would consider to meet the standard of formalization, nor are conditions of consistency or completeness of the theory given.  However, these mathematical procedures were not yet developed in the century in which Descartes lived, and this needs to be taken charitably into historical account.


Filed under Academia, Analytic, Descartes, History, Intellectual, Mathematics, Philosophy, Uncategorized

Second Order Reality: The Relationship of Mathematics and Philosophy

Introductory Remarks

The great thing about the internet is being able to share ideas, which is great unless you can’t really afford to share them.  For the last several months I’ve been working on the applications of graduate-level mathematics to various aspects of cognitive-linguistic questions.  It occurred to me today, however, that it would be worth taking a break to discuss the deep relationship between philosophy and mathematics.  In my experience, the connection often genuinely surprises people.  But in truth I can see why.  In popular culture we generally overuse the word philosophy as providing a “way of life” to guide practical decisions or interpretations of things that happen to us; Daoism or Confucianism comes to mind here.  Or we may just think of philosophy as pursuing the meaning of life in some affective or mystical sort of penetrating way.  The Prussian philosopher Immanuel Kant is often quoted as saying that science is organized knowledge and wisdom is organized life.  In this understanding, those who have studied academic philosophy can see that contemporary philosophy is defined more by investigations of the former kind than the latter.

So: why mathematics?  From a conceptual standpoint, there are numerous reasons, but I would like to discuss what I feel are the two most fundamental: (1) a commonality in the nature and structure of the method of analysis; and (2) the shared intellectual desire to find the fundamental source of knowledge, together with the shared rational value of determining the essential structure of things.  We could spend a lot of time on these, but in a basic way it’s pretty straightforward.

(1) Philosophy and mathematics are “second order” fields: their investigations do not necessarily apply to the mechanics of the physical world but instead cover, more broadly, the complete spectrum of what is necessary, what is possible, and what is impossible.  Generally, the humanities, social sciences, and natural sciences conduct “first order” investigations into real objects or events (states of affairs) and do everything in the capacity of their methods to separate possibility from actuality.  They generally give a finite set of principles that provide laws, or perhaps just interpretations, for how some set of data is defined, ordered, explained, and perhaps predicted.  Mathematics, as well as philosophical logic, serves as a way of modeling the structure of these other fields, a way of analyzing the real world.  Generally, mathematics does so syntactically, while philosophy adds a conceptual level of analysis.  Hence we have the philosophy of language, the philosophy of science, et cetera, as specialized sets of analyses of first-order organizations of knowledge of the real world.  But the important thing to understand is that philosophical and mathematical analyses are “pure” or “theoretical”, whereas analyses conducted in other fields (e.g. behavioral analysis in psychology, or technical analysis in the valuation and prediction of equity markets) are generally applications of empirical data to empirical outcomes.

(2) Philosophers have often been attracted to mathematics because of the nature of proof as the highest standard for knowledge.  With mathematics one achieves precision in the deductive relations of mathematical structure.  The hope has often been that the further one can extend mathematics, the more sound the foundations of knowledge.  In many ways this pursuit has defined a substantial amount of modern philosophy, as epistemology since the 16th century has often been tied to this desire.


Filed under Academia, Analytic, Intellectual, Mathematics, Philosophy, Uncategorized

Science and Morality: Sam Harris’ “Science Can Answer Moral Questions” TED Talk

One of my primary areas of interest is the field of ethics.  Specifically, for the last few years I have been extremely interested in the relationship between science and ethics. Can or should scientific findings inform our ethical theories? In what ways should our findings in psychology and neuroscience shape our ethical judgements of right and wrong or of moral innocence and guilt?

Given this interest, I was quite excited to stumble across a 2010 TED Talk by Sam Harris entitled “Science Can Answer Moral Questions.” For those unfamiliar with Harris, he is a neuroscientist/philosopher who is probably best known for his criticism of religion and as a co-founder of Project Reason. In 2010 he wrote a book entitled The Moral Landscape in which he proposes that science can, in principle, offer answers to moral questions.

The talk I link to below basically gives a brief summary of the view argued for in The Moral Landscape. I find the proposal rather interesting and am sympathetic to his arguments, though I do have some concern that the ethical theory he sketches (which he explicitly notes in the book is a form of utilitarianism) faces some serious complications and objections.

That said, I thought I would share the TED Talk in the hopes that it will lead to some interesting discussions. I personally am quite interested to hear about what others think of his definition of “value” and of his project in general.



Filed under Uncategorized