Synechepedia

Logic

Table of Contents

WIP

What

… this is a way the mind moves, from the first two to the third.

(Hart 2010)

On Formalization

Sapir said all systems leak; he was referring to the fact that no grammatical system has ever successfully captured a real natural language, but it is natural to generalize his slogan to formalizations of any complex natural sign system. There are always “loose ends” … Thus we cannot expect our semiotic models to be perfect. However, a precise description that is somewhat wrong is better than a description so vague that no one can tell if it’s wrong. We do not seek to formalize actual living meanings, but rather to express our partial understandings more exactly. Precision is also needed to build computer programs that use the theory. I do not believe that meaning in the human sense can be captured by formal sign systems; however, human analysts can note the extent to which the meanings that they see in some sign system are preserved by different representations. Thus we seek to formalize particular understandings of analysts, without claiming that such understandings are necessarily correct, or have some kind of ideal Platonic existence.

(Goguen 1998)

Why

Jean-Yves Girard

An old activity like logic can find its justification neither in the preservation of a rather obsolete tradition, nor in technical developments, no matter how heroic and brilliant they might be. Its meaning should be sought in questions of a true logical nature, i.e., dealing with the fundamentals of reasoning. As a central task, the building of a non-fregean theory of cognition, the benchmark for such an endeavor being an updated version of incompleteness: to prove, once and for all, that questions are not the same thing as answers, i.e., the inexistence of those unlikely X-rays of knowledge.

(Girard 2011)

TODO How

Recycling

Logic, surely born essentialist, began with manipulating universal rules. Long afterwards, at the end of a complex process, we eventually find an underlying structure for these rules; and thus, to require that a proof of A ∨ B be […] a proof of A or a proof of B. One eventually rediscovers the existence under essence. But, while studying this existence, one reinstalls essence.

We started with the rules of logic; we arrived at the logic of rules and it turns out to be the logic we started with. … Circularity is therefore not only a tarskian void, it is also a sign of harmony. But one cannot content oneself with that!

(Girard 2011, 139)

Some patterns seem to cycle – spiraling – through the thoughts and techniques discernible in the roots and margins of modern logic. These notes record a winding thread I’ve been picking at. The thread runs from syllogistic logic, through 20th-century proof theory, and continues to coil through the proof-theoretic formal systems that have been transforming computer science.

TODO Structural Proof Theory

… structural systems may be considered as providing a general framework of consequence, in terms of which specific logical systems can be defined.

(Schroeder-Heister 2002, 3)

A Note on Genealogy

Paul Hertz is the progenitor of the structural analysis of proofs. Gentzen utilized this approach to found structural proof theory (see (Legris 2012) and (Schroeder-Heister 2002)). Hertz published his work, On Axiomatic Systems for Arbitrary Systems of Sentences, in 1922. In a short biography, Hilbert’s collaborator Paul Bernays described the contribution thus:

Hier führte er viele methodisch wichtige Begriffsbildungen ein und gelangte zu mathematisch prägnanten Ergebnissen. Diese Untersuchungen sind Vorläufer verschiedener neuerer Forschungen zur mathematischen Logik und Axiomatik, insbesondere hat G. Gentzens Sequenzenkalkül von den H.schen Betrachtungen über Satzsysteme seinen Ausgang genommen.

Here he introduced many methodologically important conceptual developments and arrived at mathematically incisive results. These investigations were forerunners of various new researches into mathematical logic and axiomatics, in particular, G. Gentzen’s sequent calculus took its starting point from Hertz’s work on sentence systems.1

Gentzen’s work was, according to a reductive formulation, the application of Hertz’s approach and core methodology to Hilbert’s Program. The broadly conceived field of constructive proof theory, encompassing efforts in automated theorem proving, type theory, and programming language theory, is inconceivable without Gentzen’s work (though we can easily imagine a more worthy person than the Nazi Gentzen having done that work).

TODO Inference and The “Sentence” Gadget in Hertz’s Essays on Sentence Systems

The beginning (and kernel) of Hertz’s On Axiomatic Systems…:

Whenever a system of sentences is recognized to be valid, it is often not necessary to convey each and every sentence to memory; it is sufficient to choose some of them from which the rest can follow. Such sentences, as it is generally known, are called axioms. The choice of these axioms is to a certain degree arbitrary. One can ask, however, if the property of a system of sentences to have several axiom systems is interconnected with other remarkable properties, and if there are systematic approaches to find, as the case may be, that axiomatic system which contains the least possible number of sentences. In the following some thoughts shall be communicated, which might be useful as a pre-stage for the treatment of these or related problems.

In fact the actual problem of interest is so entangled, that initially it seems appropriate to be content with an immense simplification: We only consider sentences of a certain type, sentences that we can write symbolically: (a1, ..., an) → b and that can be expressed linguistically by formulations such as: If (a1, ..., an) altogether holds, so does b. In addition, a second simplification will be introduced in the present first part, by only considering sentences of type a → b; however, we will liberate ourselves from this limitation in a following part. Further we assume rules according to which from certain sentences other ones follow: So, e.g., the validity of the sentences a → b, b → c should result in the holding of the sentence a → c.

However, what is actually meant by such a sentence, what the symbol means in the combination of characters a → b or the word ‘if’ in the corresponding linguistic formulation, does not have to be indicated here.

(Hertz and Legris 2012)

Recapitulation: Hertz aimed to analyze systems of sentences determined by a transitive “follows” relationship. As a simplification, he narrowed his focus to sentences of the form (a1, ..., an) → b, taken to mean “if a1 and … and an are all true then b is true”. He left the meaning of all these parts undetermined, including what the characters refer to (he calls them “elements” throughout the essay), what the → means, and even what the word ’if’ means.

However, in a footnote, he reveals a critically important interpretation:

It might be added though, that our sentences a → b are nothing other than formal “implications” in the sense of Russell (Whitehead and Russell 2005, 22), and that the scheme of inference used as a base in the first part is the Theorem listed by Russell as No. 10, 3 [*10·3], p. 150, or put differently: Our sentences are judgements of subsumptions, our inferences are syllogisms of modus Barbara.

(Hertz and Legris 2012, 12)

Russell and Whitehead’s “formal implications” are the propositions stated by universally quantified implications: ∀x.Sx → Px. Hertz explicitly equates his sentences both with these and with “judgements of subsumption”. The latter are the best-known variety of judgment from classical term logic, exemplified in the immemorial truth “All humans are mortal” and schematized as “All S are P”. Hertz is hinting that, when we do move to assign meaning to the symbols of his formalism, we might read a → b as All a are b or, if we prefer modern predicate logic, ∀x.a(x) → b(x).

Barbara is the classical syllogism

  All S are P
  All P are Q
∴ All S are Q

This is equivalent to the transitivity of the “follows” relation (which we might also restate as (S → P & P → Q) → (S → Q)), where each sentence of the form A → B is a “formal implication”.
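To fix intuitions, here is a minimal sketch in Haskell of the restricted sentence systems of Hertz’s Part I (the encoding and constructor names are mine, not Hertz’s), with Barbara/transitivity as the sole schema of inference:

    -- Hertz's "elements", left entirely uninterpreted.
    type Element = String

    -- A Hertz sentence (a1, ..., an) -> b: antecedent elements, one succedent.
    -- Following the footnote, read [a] :->: b as "All a are b",
    -- i.e. forall x. a(x) -> b(x).
    data Sentence = [Element] :->: Element

    -- Part I restricts attention to sentences a -> b, and the sole
    -- inference is Barbara: from a -> b and b -> c, conclude a -> c.
    barbara :: Sentence -> Sentence -> Maybe Sentence
    barbara ([a] :->: b) ([b'] :->: c)
      | b == b'   = Just ([a] :->: c)
    barbara _ _   = Nothing

E.g., barbara (["greek"] :->: "human") (["human"] :->: "mortal") yields Just (["greek"] :->: "mortal").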

What is inference?

Inference. The process of inference is as follows: a proposition “\(p\)” is asserted, and a proposition “\(p\) implies \(q\)” is asserted, and then as a sequel, the proposition “q” is asserted. The trust in inference is the belief that if the two former assertions are not in error, the final assertion is not in error. Accordingly whenever, in symbols, where \(p\) and \(q\) have of course special determinations,

“\(\vdash p\)” and “\(\vdash (p \supset q)\)”

have occurred, then “\(\vdash q\)” will occur if it is desired to put it on record. The process of inference cannot be reduced to symbols. Its sole record is the occurrence of “\(\vdash q\)”. It is of course convenient, even at the risk of repetition, to write “\(\vdash p\)” and “\(\vdash(p \supset q)\)” in close juxtaposition before proceeding to “\(\vdash q\)” as a result of the inference. When this is to be done, for the sake of drawing attention to the inference which is being made, we shall write instead

“\(\vdash p \supset \vdash q\),”

which is to be considered as a mere abbreviation of the threefold statement

“\(\vdash p\)” and “\(\vdash (p \supset q)\)” and “\(\vdash q\).”

Thus “\(\vdash p \supset \vdash q\)” may be read “\(p\), therefore \(q\),” being in fact the same abbreviation, essentially, as this is; for “\(p\), therefore \(q\)” does not explicitly state, what is part of its meaning, that \(p\) implies \(q\). An inference is the dropping of a true premiss; it is the dissolution of an implication [emphasis mine].

(Whitehead and Russell 2005, 9)
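A gloss worth recording (mine, via Curry–Howard; nothing like it is in Principia): the “dissolution of an implication” is function application, in which the premiss is consumed and only the conclusion remains on record:

    -- Inference as the dropping of a true premiss: apply the proof of
    -- p ⊃ q to the proof of p; the sole record that remains is q.
    inference :: p -> (p -> q) -> q
    inference premiss implication = implication premiss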

  • TODO Note the different meaning of the turnstile here.
  • TODO recapitulate and indicate significance
TODO Cut is Barbara (find Hertz example of this)
TODO Structural reasoning is using TFL to formalize MPL

Can sequents in general (i.e., with multiple antecedents) still be read as judgments of subsumption?

all [syntax objects] are [syntax objects]

This is tough (but worth fighting for).

Easier to see are the formal implications (where each syntactic object is predicated as “is true”; this is Martin-Löf’s point).

Need to explain the move to multiple antecedents.

TODO Do structural rules perhaps fit other syllogistic figures?

If not, is it possible to derive “novel” structural rules via encoding other figures?

TODO Gathering together

Russell and Whitehead on the juxtaposition of signs effecting a juxtaposition of thoughts.

Structurally, logical operators are a way of linking, drawing together, their operands. \(A \land B\), \(A \lor B \), and \(A \supset B\) each express a way of considering \(A\) and \(B\) gathered together, but in different ways and under different conditions.

What about the gathering of \(A\) and \(\land\)? What conditions this gathering together? As per Martin-Löf, this is the implicit judgment that \(A \: prop\). We seem to need a way of indicating when new space needs to be opened up between juxtaposed terms, to make room for more subtle thoughts. Yet we also need to be able to fold up this complexity to reduce the noise when we’re thinking at a higher level of abstraction, or already understand these binding conditions to be in effect.

Exegesis of Gentzen on the Meaning of his Calculi

In NJ, the definition of the logical symbols that combine formulae is given by the inference figures for the introduction and elimination of the symbol. In LJ, new inference figures are introduced that do not correspond to logical symbols, but instead to “structural transformations”. What is the meaning of these “structural inference figures”? How do they get introduced?

Gentzen’s driving aim in introducing LJ is to preserve the ability to define the logical symbols by their introduction and elimination rules, while making the deductive calculus “logistic”. Being “logistic” means that each formula that occurs should be a logical truth, not dependent on external assumptions.

Gentzen’s derivations are trees of formulae or sequents that reflect the “following” relation between terms. In NJ, assumptions are recorded on the leaves of the derivation tree, but they are external to the formulae themselves. For example, in the proof that \(A \land B \supset B \land A\):

A & B [1]      A & B [1]
--------- &Er  --------- &El
   B               A
------------------------ &I
        B & A
------------------------ ->I[1]
     A & B ⊃ B & A

The formulae B, A, and B & A are all dependent on assumption [1]. In effect, the intermediary formulae in an NJ derivation do not track their own justification, so we require contextual knowledge of the whole derivation tree to reason about the subproofs that justify their presence.
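Under Curry–Howard (my gloss, not Gentzen’s), the derivation above is the program below; the assumption [1] is the bound variable p, recorded only at its binder, while the intermediate values snd p and fst p carry no trace of that dependence in their own types:

    -- The NJ proof of A ∧ B ⊃ B ∧ A as a program. &Er and &El are the
    -- projections snd and fst; ->I[1] is the lambda that binds p.
    swap :: (a, b) -> (b, a)
    swap = \p -> (snd p, fst p)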

So, how do we make NJ “logistic”?

The most obvious method of converting an NJ-derivation into a logistic one is this: We replace a [derivation formula] \(B\), which depends on the assumption formulae \(A_1, ..., A_u\), by the new formula \((A_1 \& ... \& A_u) \supset B\). This we do with all [derivation formulae].

Recapitulating with an example: we can render our proof of \(A \land B \supset B \land A\) logistic by recording the assumption formula in the antecedent of a conditional prefixed to each of the three dependent formulae. Since \(B\) depends on assumption [1], we rewrite \(B\) as \(A \land B \supset B\). We do the same with the two remaining dependent formulae and, for consistency, add the trivial self-implication of the assumption to get:

A & B ⊃ A & B      A & B ⊃ A & B
-------------- &Er -------------- &El
  A & B ⊃ B          A & B ⊃ A
---------------------------------- &I
          A & B ⊃ B & A
---------------------------------- ->I
      A & B ⊃ A & B ⊃ B & A

We thus obtain formulae which are already true in themselves, i.e., whose truth is no longer conditional on the truth of certain assumption formulae. This procedure, however, introduces new logical symbols \(\land\) and \(\supset\), necessitating additional inference figures for \(\land\) and \(\supset\), and thus upsets the systematic character of our method of introducing and eliminating symbols.

This problematic complication is evident in our attempted rewrite! The principal operator in each formula is now \(\supset\), but (excepting the conclusion) the elimination and introduction rules are all meant to be working on \(\land\). Worse, we would need another set of introduction and elimination rules for the “new” logical symbols, and would have to keep ourselves from mixing them up with the identical “old” ones.

For this reason, we have introduced the concept of a sequent. Instead of a formula \((A_1 \& ... \& A_u) \supset B\) we therefore write the sequent

\[ A_1, ..., A_u \to B. \]

The informal meaning of this sequent is no different from that of the above formula; the expressions differ merely in their formal [syntactic] structure.

I.e., Gentzen introduced an alternate surface syntax to disambiguate a structure in the metalanguage that is (informally) synonymous with implication in the object language.
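To see how little is at stake syntactically, here is a sketch of my own (the constructors are invented, not Gentzen’s formalism): the informal meaning of a sequent can be recovered by folding its antecedents back into a conjunction that implies the succedent.

    data Formula = Atom String
                 | Formula :/\: Formula   -- conjunction
                 | Formula :=>: Formula   -- implication

    -- A sequent A1, ..., Au -> B.
    data Sequent = [Formula] :|-: Formula

    -- The formula the sequent abbreviates: (A1 & ... & Au) ⊃ B,
    -- or simply B when there are no antecedents.
    meaning :: Sequent -> Formula
    meaning ([]       :|-: b) = b
    meaning ((a : as) :|-: b) = foldl (:/\:) a as :=>: b

This change of surface syntax did not, however, do away with the problem of needing new inference figures: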

Even now new inference figures are required that cannot be integrated into our system of introductions and eliminations; but we have the advantage of being able to reserve them special places within our system, since they no longer refer to logical symbols, but merely to the structure of the sequents. We therefore call these ’structural inference figures’, and the others ’operational inference figures’.

When Gentzen says “they no longer refer to logical symbols”, this is true only because he excluded these formal implications from the system, lifting them into the metalogic by use of Hertz’s notation. Each of the “structural inference figures” given in 1.21 is valid by virtue of the real logical meaning of the sequents, but this meaning gets obscured. As an example, “thinning”

   A -> C
----------
B, A -> C

is valid just because \((A \supset C) \supset (B \land A \supset C)\) is a logical truth. The “meaning explanation” for each structural rule can be given in the same way.
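The same point can be made computationally. Reading a sequent, via Curry–Howard, as a function from its (tupled) antecedents to its succedent (my gloss, not Gentzen’s), each structural rule is already a definable transformation, with no new logical symbols required:

    -- Thinning:     from A -> C conclude B, A -> C.
    thinning :: (a -> c) -> ((b, a) -> c)
    thinning f (_, a) = f a

    -- Exchange:     from A, B -> C conclude B, A -> C.
    exchange :: ((a, b) -> c) -> ((b, a) -> c)
    exchange f (b, a) = f (a, b)

    -- Contraction:  from A, A -> C conclude A -> C.
    contraction :: ((a, a) -> c) -> (a -> c)
    contraction f a = f (a, a)

    -- Cut: from A -> B and B -> C conclude A -> C (Barbara once more).
    cut :: (a -> b) -> (b -> c) -> (a -> c)
    cut f g = g . f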

Digression: Reasons to prefer constructive logic
  • Classical logic encourages weak arguments

    If you grant my premises, then one or more of the following conclusions must follow, but I can’t tell you which one.

    Viewed from a rhetorical perspective, sequent calculus is a formal language for reasoning about forms of argument. In the sequent calculus, the only difference between intuitionistic logic and classical logic is whether multiple terms are allowed in the consequent: in intuitionistic logic, all sequents are of the form

    \[ A_1, ..., A_u \to B \]

    I.e., intuitionistic sequents only allow a single term in the consequent. To get classical logic, it suffices to allow sequents of the form

    \[ A_1, ..., A_u \to B_1, ..., B_v \]

    i.e., where multiple terms can appear in the consequent. Using the signs of the object logic, this means

    \[ A_1 \land ... \land A_u \supset B_1 \vee ... \vee B_v \]

    Arguing in classical logic is therefore arguing in a system in which interlocutors are allowed to make claims like: Assuming \(A_1\) through \(A_u\), one or more of the following propositions is true: \(B_1, ..., B_v\). It seems reasonable to lay down a rule that says: when you make an assertion conditional on some assumptions, stick to one conclusion per hypothetical! (The derivation of excluded middle given after this list makes concrete what multiple consequents buy.)

  • Classically, not A “implies” A

    \[ \neg A \supset A \]
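To make the payoff of multiple consequents concrete, here is the standard derivation of excluded middle in the classical sequent calculus (textbook material, not Gentzen’s own example). The second line is already illegal intuitionistically, since two formulae appear in the consequent:

        A -> A
      ----------- ¬R
       -> A, ¬A
     -------------- ∨R
     -> A ∨ ¬A, ¬A
   ------------------ ∨R
   -> A ∨ ¬A, A ∨ ¬A
  -------------------- contraction
       -> A ∨ ¬A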

TODO Reiterated by Girard

The novelty of Gentzen is the introduction of hypothetical deduction as a primitive; besides the implication \(A \Rightarrow B\), there coexists the sequent \(A \vdash B\): “\(B\) under the hypothesis \(A\)”. One will never insist enough, from a brutal standpoint […], this creation makes no sense; it is a pure duplicate, since the deduction theorem equates the two notions. Sequent calculus makes sense only when one steps beyond mere provability, when one works en finesse.

(Girard 2011, 42)

TODO What is happening here?

TODO Analysis by Schütte (Schütte 1977), 2–3: higher-order reasoning required

“…using induction that goes beyond mathematical induction but with a finite character”

  • TODO Positive and negative parts, polarity, Sommers’ Relational TFL

TODO Truth and Quotation

TODO Dana Scott on semantic ascent and encoding implication into the object language
Each “change” in syntax seems to be a “semantic ascent”
Truth and Disquotation

This ascent to a linguistic plane of reference is only a momentary retreat from the world, for the utility of the truth predicate is precisely the cancellation of linguistic reference. The truth predicate is a reminder that, despite a technical ascent to talk of sentences, our eye is on the world. This cancellatory force of the truth predicate is explicit in Tarski’s paradigm:

’Snow is white’ is true if and only if snow is white.

Quotation marks make all the difference between talking about words and talking about snow. The quotation is a name of a sentence that contains a name, namely ’snow’, of snow. By calling the sentence true, we call snow white. The truth predicate is a device of disquotation [emphasis mine].

(Quine 1986, 12)

TODO How much of the “ad hoc” machinery in some formalisms could be dispensed with if we could formalize this process, and make it flexible enough to recycle and spin up into semantic ascent at will?

TODO Related Angles

TODO Modality

Roughing in my current thoughts on modality (excerpted from a conversation with Callan McGill):

I think I have a clearer understanding of my worry, which also helps clarify the stakes for me and gives somewhat clearer criteria by which I can end up convincing myself whether or not I should embrace modalities. My hypothesis is that phenomena like time and possibility ideally do not need to be imported as conceptual primitives into our logics, particularly because the way we encode these phenomena reifies and reinforces post-hoc conceptual framings (e.g., in the case of temporal logic, the idea that time is adequately encoded as a space-like dimension).

Looking at how intuitionistic and linear logic work, in contrast to modal logic, is instructive in this regard. If you’ll allow me the gross simplification: Brouwer wanted to be sure we accounted for the phenomenon of things which may not be true one way or the other, which may be neither provable nor disprovable (let’s say, because they are simply inconceivable, and we can’t construct a fact of the matter). But he didn’t bolt on an axiom system or a propositional modality that allows us to state: “this proposition may not be provable”. Rather, the phenomenon of unprovability is made immanent in the logic itself, by moving to a “weaker” logic that lets us attend to this difference. (I hope this isn’t an entirely BS account.)

My understanding of Girard and linear logic vis-à-vis “resources” is the same: the way to accommodate the phenomenon of resources is not to add a “resource theory” into the logic; it’s to learn how to move back to a subtler, “weaker” logic that lets us recognize the way in which the phenomenon of resource limitation is already at play.

So, given this refined understanding, I think I can prove to myself that modalities are needed, if it turns out that time is not, cannot be, something which is “essentially logical”. That is, that the phenomenon of time is not an inherent and immanent part of what it is to “do logic”.

This seems exceedingly unlikely to me, but we’ll see!

Callan connected this POV with the problems arising from internalizing judgments in type theory:

Me:

iiuc, the relation is somewhat like this? The identity judgment is a fundamental constitutive component of our logic in this case. But if we internalize that judgment, make it an object inside of the logic, then we undermine the structural integrity of the system?

Callan:

Yes, there are two things: the equality judgment, the part that is determined by just unrolling definitions, and the equality type of the theory, which is subtle and the basis of almost all the mathematical aspects of the theory. Internalizing the equality judgment undermines the equality type to the point that its nature is completely changed, so somehow the logic becomes ruined.

Me:

And so a similar framing in this case would be to say: perhaps if the phenomenon of time is “really” part of the constitutive structure of a logic, then we undermine the cohesiveness (or the “structural adequacy”, something to unpack later) of logic by putting that phenomenon into the logic as an object to manipulate.

Callan:

Yes. And it made me think that internalizing external judgments of the theory (e.g., that not every statement can be decided) ruins the logic. This internalization seems to be a case of what Wittgenstein warned of: when we’ve let the unutterable, but essentially constitutive, take shape as questions, they turn up as monstrous puzzles which don’t actually have a solution.

The above is a simplistic view of the relation between modality and linear logic. Girard notes that the exponentials are “something like” the modal operators from S4 (he characterizes linear logic as “S4 + structural rules”) in Truth, modality and intersubjectivity.

It is interesting to note that it is precisely at these modal operators that linear logic loses the security of its footing:

The exponentials $!$ and $?$ in linear logic are less carved in the marble than
the other connectives. Indeed, if one uses traditional sequent calculus
presentations, the exponentials are not “canonical”: if you introduce another
copy of exponentials, say $!′$ and $?′$, with the same rules as the original
ones, there is no way to prove that $!$ is equivalent to $!′$, and $?$ to $?′$,
while for the other connectives this is easily established.

In this respect, the $!$ and $?$ resemble the box and diamond connectives found
in modal logic, and it is then possible and interesting to study variations for
the logical rules of these connectives. For example, elementary linear logic
(ELL) is obtained by replacing the $!$ and $?$ introduction by a single rule
introducing $!$ and $?$ at the same time. As a consequence, ELL can encode all
and only the functions over integers that normalize in time bounded by an
elementary function.

https://stanford.library.sydney.edu.au/archives/spr2014/entries/logic-linear/#DifTreMod

Following Girard, we can attribute the unsteady nature of the exponentials to the fact that they mark the interjection of the subjective into the logic!

This means that !A is subjective, since depending on a viewpoint \(P\).

We eventually discover that the \(!A\) is exactly an affirmation: \(!A\) means that \(A\) is true w.r.t. a certain viewpoint \(P\) ; it should therefore be noted \(!_PA\).

(Girard, “Truth, modality and intersubjectivity”)

In (Baelde 2012), the modal operators are replaced with least and greatest fixed points, providing induction and coinduction on terms as an alternative means of reasoning about “unbounded (infinite) behavior”:

The first-order theory of MALL (multiplicative, additive linear logic) over only equalities is an interesting but weak logic since it cannot capture unbounded (infinite) behavior. Instead of accounting for unbounded behavior via the addition of the exponentials (\(!\) and \(?\)), we add least and greatest fixed point operators. The resulting logic, which we call \({\mu}MALL^=\), satisfies two fundamental proof theoretic properties. In particular, \({\mu}MALL^=\) satisfies cut-elimination, which implies consistency, and has a complete focused proof system. The second result about focused proofs provides a strong normal form for cut-free proof structures that can be used, for example, to help proof search.
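To give a flavor of the alternative (the Haskell encoding is mine, and is only an analogy to Baelde’s proof theory): least fixed points come with induction (fold), greatest fixed points with coinduction (unfold), so unbounded behavior is reached without \(!\) or \(?\):

    {-# LANGUAGE RankNTypes, ExistentialQuantification #-}

    -- Least fixed point: everything you can do with it, you do by induction.
    newtype Mu f = Mu (forall a. (f a -> a) -> a)

    fold :: (f a -> a) -> Mu f -> a
    fold alg (Mu k) = k alg

    -- Greatest fixed point: built by coinduction from some seed state.
    data Nu f = forall s. Nu s (s -> f s)

    unfold :: (s -> f s) -> s -> Nu f
    unfold step s = Nu s step

    -- E.g., naturals as Mu NatF (inductive), streams as Nu StreamF (coinductive).
    data NatF    x = Zero | Succ x
    data StreamF x = Cons Int x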

Time consciousness (phenomenologically speaking) has at least the following two aspects:

directed

One of the most marked features about the law of the mind is that it makes time to have a definite direction of flow from past to future. The relation of past to future is, in reference to the law of mind, different from the relation of future to past. This makes one of the great contrasts between the law of mind and the law of physical force, where there is no more distinction between the two opposite directions in time than between moving northward and moving southward.

(Peirce 1992)

recurrent

If life is not always poetical, it is at least metrical. Periodicity rules over the mental experience of man, according to the path of the orbit of his thoughts. Distances are not gauged, ellipses not measured, velocities not ascertained, times not known. Nevertheless, the recurrence is sure. What the mind suffered last week, or last year, it does not suffer now; but it will suffer again next week or next year. Happiness is not a matter of events; it depends upon the tides of the mind.

(Meynell 1896)

Girard has indicated that he connects the directedness with non-commutative operations:

Time occurs when we cannot permute two rules, since one must be performed before the other, for fear of a procedural catastrophe. This is therefore the alternation positive/negative, answer/question, explicit/implicit.

(Girard 2011)

It seems sensible to me that the right place to find the recurrence (and thereby the metricality) might be in fixed points, which allow structural induction, rather than in the admission of operators that let us posit permanence and inexhaustibility by fiat.

TODO Whither?

Linear logic is a refinement of classical logic and intuitionistic logic. Instead of emphasizing truth, as in classical logic, or proof, as in intuitionistic logic, linear logic emphasizes the role of formulas as resources.

(SEP)

This suggests that we may discover and evolve as many logics as there are structural aspects of the relation between being and thought that can bear emphasizing and are essentially susceptible to formal articulation (this last criterion is essential, and should guard against “mustard watches”).

One horizon for exploration, then, is the discovery and articulation of additional aspects. Another – at a higher dimension – is the exploration of what makes possible such “aspects”, what make emphasizing them possible and interesting. When is an aspect adequate to reveal an interesting logic, and how can we determine if it is susceptible to formal articulation?

Cf. the theory of institutions and the project of universal logic, and the project of transcendental techniques.

Links

References

Baelde, David. 2012. “Least and Greatest Fixed Points in Linear Logic.” ACM Transactions on Computational Logic 13 (1): 1–44. https://doi.org/10.1145/2071368.2071370.
Girard, Jean-Yves. 2011. The Blind Spot: Lectures on Logic. European Mathematical Society. http://www.worldcat.org/oclc/757486610?referer=xid.
Goguen, Joseph. 1998. “An Introduction to Algebraic Semiotics, with Application to User Interface Design.” In International Workshop on Computation for Metaphors, Analogy, and Agents, 242–91. Springer.
Hart, W. D. 2010. The Evolution of Logic. Cambridge New York: Cambridge University Press.
Hertz, Paul, and Javier Legris. 2012. “On Axiomatic Systems for Arbitrary Systems of Sentences.” In Universal Logic: An Anthology. From Paul Hertz to Dov Gabbay, 11–29. Springer Science + Business Media. https://doi.org/10.1007/978-3-0346-0145-0_2.
Legris, Javier. 2012. “Paul Hertz and the Origins of Structural Reasoning.” In Universal Logic: An Anthology. From Paul Hertz to Dov Gabbay, edited by J.-Y. Béziau, 3–10. Springer Science + Business Media.
Meynell, A. 1896. The Rhythm of Life and Other Essays. https://books.google.ca/books?id=xKbugkcaLosC.
Peirce, Charles. 1992. The Essential Peirce: Selected Philosophical Writings. Bloomington: Indiana University Press.
Quine, W. V. 1986. Philosophy of Logic. Cambridge, Mass: Harvard University Press.
Schroeder-Heister, Peter. 2002. “Resolution and the Origins of Structural Reasoning: Early Proof-Theoretic Ideas of Hertz and Gentzen.” Bulletin of Symbolic Logic 8 (2): 246–65.
Schütte, K. 1977. Proof Theory. Grundlehren der mathematischen Wissenschaften 225. Springer-Verlag. http://gen.lib.rus.ec/book/index.php?md5=a337a65ca01284a93ab578f28d8d7bec.
Whitehead, Alfred North, and Bertrand Russell. 2005. Principia Mathematica. Ann Arbor, Michigan: University of Michigan Library. http://name.umdl.umich.edu/AAT3201.0001.001.

Footnotes:

1

Please pardon my inexpert translation.