Figure 1, left illustrates the basic ideas of the ''weak syntactic paradigm''; we refer the reader to the first two reports for a fuller treatment of the argument. In the triangle the three vertices represent the linguistic, mental and real worlds. All these worlds have an inner functional structure, by which more complex elements are built from simpler ones, until atomic elements are reached. The correspondences between the worlds preserve these functional structures. In more scientific environments the languages are formal and the elements of the linguistic world are symbols and formulas, while the mental world is often made up of the mental concepts and models representing reality. The links between the linguistic world and the other two are given by the 'meaning' relation, characterised in the figure as ''intensional'' and ''extensional'' semantics. In the first two reports we underlined the solid link between natural language and mathematical concepts in Greek culture: the examples were the logic terminology and the splitting of the semantic field of eimi (to be), the difficulty in dealing with the infinite and the absence of the zero connected with the negatives, the opposition between one and number connected with the singular/plural distinction, and the idea of equation connected with the evolution of the word isos (equal). The process of building a new, almost autonomous philosophic-logic-mathematical lexicon began with Aristotle and in the Renaissance was extended to include a physical-mechanical lexicon as well.
It is noteworthy that modern science in the XVII century succeeded in fitting natural change too into the scheme (fig.1 right), as something that can be formally expressed and analysed. Wundt, quoted in [CASSIRER, 1906-50 #246], underlined that this was necessarily bound to the philosophy of mechanism, for movement was the only change in which it was possible to preserve the identity of the changing object. The ancient concept of ''substance'' progressively disappeared from Physics, but the Parmenidean requirement of a 'fixed being' can be recognized in the new science as well, in the role played there by the ''conservation principles''. In the Renaissance the strongest form of the ''syntactic paradigm'' was the Cartesian pair ''clear and distinct ideas'' and ''geometric extension''. 'Being' was in 'space', and then 'change' was substantially 'movement'. In this framework the new physics was able to extend the realm of being to include change as movement. This process was framed in the Galilean mathematical foundation of physics. It is useful to remember that, until Scholastic philosophy, physics and mathematics were thoroughly distinct. Among the natural sciences, only astronomy and music (scientiae mediae in Saint Thomas, XIII century) showed a connection with mathematics. That achievement broke the ancient link between natural change and logical contradiction, normally unified in Greek philosophy under the common heading of not-being. Thus, contradiction became only a formal phenomenon. Weyl claimed in his Philosophie der Mathematik und Naturwissenschaft ([WEYL, 1926 #221], 37-38) that the concept of ''limit'', crucial for the theoretical establishment of the new connection between physics and mathematics achieved by modern analysis, is grounded on a ''static'' attitude toward becoming, made possible only by the employment of the idea of the actual infinite.
The succession of the partial sums of a series or the Dedekind section defining a real number are not generic actually infinite definitions, but potentially infinite constructions, which, only when governed by a well defined law, can be employed to identify a well defined mathematical entity as the result of an actually infinite set. A trace of the inclusion of changing, of the process, in being is also given by the fact that the logical rules of inference, the basic engine of theorem proving, are derived from tautologies, the basic analytical unchanging truths of logic. The core of the new version of the paradigm is in the idea of natural law. Weyl [WEYL, 1926 #221] underlined that the law is the Leibnizian form of the most characteristic point of any natural philosophy, as the substances were for Aristotle or the ideas for Plato: the point where the Heraclitean flux stiffens.
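A standard illustration (our example, not Weyl's own notation) makes the point concrete: both the Dedekind section and the sum of a series are lawlike, potentially infinite constructions whose result is identified with a single completed, actually infinite object:

```latex
% Dedekind section for \sqrt{2}: a rule singles out an actually
% infinite set of rationals, taken as one completed entity.
\sqrt{2} \;:=\; \{\, q \in \mathbb{Q} \;:\; q \le 0 \ \text{or}\ q^{2} < 2 \,\}
% Sum of a series: the potentially infinite succession of partial
% sums s_n is frozen into the single value of the limit.
\sum_{k=0}^{\infty} a_k \;:=\; \lim_{n \to \infty} s_n ,
\qquad s_n = \sum_{k=0}^{n} a_k
```

In both cases the law (the inequality defining the cut, the rule generating the terms) is what licenses treating the unending process as one fixed being.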
For the role played by the new codified law in Greek civilization in developing the idea of human and natural rule, see [GERNET, 1973 #37][MONDOLFO, 1982 #80][SERRES, 1993 #244]. Maybe the sharpest ancient expression of the lawfulness of nature, and of the correlated lawfulness of the human mind, can be found in Plato's Timaeus, 47, where sight is considered the tool by which, observing heaven's orbits and rules, we built 'number' and 'time' and gave an order to the unstable and erratic 'orbits' and 'rules' in our mind. We showed in the first report that the idea of 'equality' also developed in Greek society in tight connection with the idea of 'justice': in the Republic, 359 c6, Plato explicitly characterizes the law as the constraint to respect equality. For the connection between the alphabetic revolution, law codification and rational thought, see [HAVELOCK, 1978 #43]. In addition, it is worth pointing out that Frege named his new logical formalism a Begriffsschrift (concept writing) and not a Begriffssprache (concept language)!
It is worth observing that in China, maybe somehow linked to the Confucian refusal of any written codification, the idea of God as a lawgiver imposing ordinances on non-human Nature (II, 581) is thoroughly absent. In the words of Chheng Hao: The laws of Heaven are wordless but they keep faith; divine law has majesty untinged with wrath. (II, 564) [NEEDHAM, 1954- #179]. Needham claims that this could, at least in part, account for the lack of a modern scientific revolution in China, despite its great culture and technology. In addition, we underlined in the first reports the connection between the absence of the 'syntactic paradigm' and the lack of an alphabetic writing revolution in China. We do not try to display here the detailed changes between the Aristotelian and the modern ''weak syntactic paradigm'', but it is useful to underline some of them: the deductive and causal Aristotelian structure of the 'mental world' is somehow 'weakened' into the modeling role it plays in the modern paradigm. And this is not the only role the ''self'' has to play: it also appears engaged in the new role (thoroughly absent in Aristotle) of observing and measuring subject in the quantitative reduction of the whole scientific enterprise.
The crucial step from the Aristotelian to the modern weak form of the ''syntactic paradigm'' is connected to the name of Leibniz [LEIBNIZ, 1875-1890 #241]. Here we find clearly stated the modern form of the relation between existence in the formal world (non-contradiction as de jure truth) and in the real one (generation as de facto truth):
Real definition is that by which the defined object is shown to be possible and not to imply contradiction ... Thus causal definitions, which contain the generation of the thing, are also real; and we think the ideas of things only insofar as we intuit their possibility (nisi quatenus earum possibilitatem intuemur) ([LEIBNIZ, 1875-1890 #241], VII, 310)
However, in Leibniz truth is first of all rational and never linked to any kind of correspondence with experience; hence in the above quotation 'generation' has a sharp 'mental' flavor, foreshadowing an intuitionistic approach. The quoted pair of theoretical truths is also mirrored in the two general truths of his metaphysics: the principle of contradiction and the principle of reason (i.e. nothing happens without a reason). Knowledge by 'direct insight' belongs only to God; for man, the only access to knowledge is symbolic thought. This way, in Leibniz the 'sign' world, which in the Aristotelian paradigm was only linguistic or mathematical (logic belonging to the 'mental' world), becomes a symbolic logic world, centered on a true ''alphabet of thought'', which assumes a sort of mathematical structure: it becomes a calculus. In this framework the compositional structure of the worlds includes not only the elements, but also the form of their connections, paving the road to the above mentioned functional inclusion in the paradigm of becoming ('movement' in the new physics), left out in the Greek approach. This is also the ground of Leibniz's contribution to the birth of the ''differential calculus'', for this inclusion of becoming is the way to reduce movement and the infinitely small to the 'fixed being' of the Parmenidean gnoseological requirement.
Our century has seen a quick and radical change in the paradigm, centered on the vanishing of the autonomous role of the mental world. This new version, the ''strong syntactic paradigm'', diffused in logic and mathematics, and in theoretical physics and computer science as well, translates 'meaning' into the coding relation between two worlds, where the formal or syntactic world becomes a ''meta-system'' with respect to the real or semantic world, often called the ''object-system'' (fig.2). In the first report we analyzed this passage mostly as it occurred in contemporary physics. However, it probably appeared even stronger in mathematics. Maybe the non-Euclidean geometries were (not the cause, but) the first sign of a change characterized by the crisis of geometrical insight, as established for two thousand years, from the Euclidean evidence to the Kantian synthetic a priori judgments. Before Kant, evidence, objectivity, truth and non-contradiction had coincided; with Kant began the rupture between objectivity and the legislative role of the mind, a distinction that in Gauss became the opposition between geometry, whose objective character was a defect from the point of view of pure reason, and arithmetic, where he found the autonomous legislative power of reason and the real mathematical rigor. Geometrical insight had been for centuries the fundamental sense-presupposition for the new physics and the infinitesimal analysis, and the basic template for mathematical rigor. As a consequence of that discovery, the very idea of ''axiom system'' began to change: no longer the set of the evident or necessary assumptions, but a simple set of hypotheses; no longer the formal translation of a true (mechanical or common sense) model of reality, but the whole 'model', available for any application related to a correct interpretation.
The ''model'' in classical physics was a network of concepts and relations, representing reality, warranting the adaequatio intellectus et rei, and in turn simply substituted, for the sake of computing, by symbols. Now the representation is given only formally, and the model is an interpretation of a world of signs. The knowledge of being is no longer warranted by the ideas, but only by sign manipulation and its technological effectiveness. The ''book of nature'' that for Galileo and Descartes was written in geometrical characters must now be written in arithmetical and logic-algebraic characters. The ''real'' number, whose representation since the discovery of ''incommensurability'' had been only geometric, gets a new definition as a ''section of the set of rational numbers''. Roughly speaking, in the realm of syntax of the ''strong syntactic paradigm'', knowledge is a play only between being and signs.
The classic distinction between ''(existing) individuals'' and ''(true) facts'' in the semantic system, or between ''(definable) terms'' and ''(provable) propositions'' in the syntactic system, and the epistemic priority of individuals with respect to facts, common in the weak syntactic paradigm, are thoroughly changed in the ''strong syntactic paradigm''. Already in Aristotle, as shown in the first report, the worlds of the paradigm, above all the formal one, show an inner functional architecture, according to which complex elements are built up from simpler ones, recursively down to a finite set of primitive elements. By and large, the first concepts (point, number, and so on) must be evident, i.e. doubtlessly recognizable in the real or abstract world. Conversely, in the modern axiomatic approach, for example in Hilbert and also in Russell, we try to define the primitive elements in the syntactic system without any reference to the object-system, only by using ''implicit definitions''. That is, the ''meaning'' of a primitive element is given by the set of formal axiomatic relations in which it appears. The first classic examples of this approach were the concepts of 'point' or 'line' in axiomatic geometry, or the axiomatic foundation of arithmetic, at the end of the XIX century, where there was no general meaning of the primitive concepts beyond what is implicitly given by the axioms. This general or formal meaning can then be further 'specialized' to some intuitive entity in the ''object-system'' by a given specific coding (''interpretation'' or ''model''). A better understanding of this point can be reached by recalling something said in the first reports: the Aristotelian foundation lets meaning appear only beyond a given threshold of complexity, so as to overcome the paradox of Plato's Theaetetus, of understanding complex structures without understanding their elements.
After the breakdown of the autonomous mental world, the elements can no longer be understood by clear and evident ideas. This way, it is not strange that the compositionality of the syntactic paradigm needs to find a new ground in the world of signs. The implicit definition of concepts by a set of axioms plays such a role: the primitive concept lies below the threshold of meaning, and can therefore be defined only implicitly by more complex expressions, the axioms, and explicitly only by an interpretation.
The point in the strong paradigm, both in modern logic and in quantum mechanics, is thus that the crucial role of syntax tends to reduce the very idea of object to its formal counterpart. The necessity of defining first of all the 'individuals' often paved the road to the 'existential import' paradoxes, to which we referred in the first report. Thus, to avoid this, in the Russell-Quine theory individuals were construed as ''singular descriptions'' and so defined by 'facts':
...a theory is committed to those and only those entities to which the bound variables of the theory must be capable of referring in order that the affirmations made in the theory be true. (Quine, On what there is [QUINE, 1953 #142])
thus accomplishing the reduction of singular descriptions (proper names) to ''descriptive phrases''. The sentence 'The present king of France is bald', to avoid the implicitly asserted existence of a present king of France, can be translated as 'There is an x such that x is the present king of France, and whoever is king of France is bald'. The non-existence of a present king of France entails that the sentence is simply false.
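In symbols (our rendering of the standard Russellian analysis, with $K(x)$ for 'x is a present king of France' and $B(x)$ for 'x is bald'):

```latex
% The definite description unpacked into existence, uniqueness
% and predication -- no individual term remains.
\exists x \,\bigl(\, K(x) \;\wedge\; \forall y\,( K(y) \rightarrow y = x ) \;\wedge\; B(x) \,\bigr)
```

Since nothing satisfies $K$, the existential conjunction fails and the whole sentence comes out false, with no need to grant the description a denotation.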
Wittgenstein's Tractatus shows the same idea: 1.1 The world is the totality of facts, not of things. [WITTGENSTEIN, 1961 #124]
Thus, in the strong paradigm, there is a sharp and diffused tendency toward the reduction of 'individuals' to 'facts'. As a consequence, the paradigm, before the interpretation phase, reduces 'equality' to 'isomorphism', and 'truth' to 'consistency'. The earliest version of this idea is probably Leibniz's ''principle of the identity of indiscernibles'', which states that two entities having the same properties are also equal; in logical form:
(for every property F: F(x) <==> F(y)) ==> x = y
In Quantum Mechanics it appears as the ''Pauli exclusion principle'', according to which no two electrons can have the same quantum state, so that the set of their quantum numbers is sufficient to characterize them in the atom. This principle could however be read as a purely nominalistic one. Its ontological nature, on the contrary, can be understood by observing that the indistinguishability relation is crucial for defining the set of possible values and hence the probability of the different states. Thus the indistinguishability criterion defines the probability distributions of the quantum states of the elementary particles (the Fermi-Dirac and Bose-Einstein statistics). Recalling how deeply physical the probabilistic behavior is in quantum mechanics, it is easy to realize that the reduction of objects to facts (values of physical properties) is unavoidable, and the very idea of somehow denoting distinct pieces of reality lies completely outside any possible quantum mechanical interpretation.
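How the indistinguishability criterion reshapes the set of possible values, and hence the probabilities, can be seen in a toy counting exercise (our construction, not from the report): two particles over three single-particle levels, counted in the three standard ways.

```python
from itertools import product, combinations_with_replacement, combinations

LEVELS = range(3)  # three single-particle levels: 0, 1, 2

# Maxwell-Boltzmann: distinguishable particles -> ordered pairs of levels.
mb_states = list(product(LEVELS, repeat=2))                 # 3 * 3 = 9 microstates

# Bose-Einstein: indistinguishable, multiple occupancy allowed ->
# unordered pairs with repetition.
be_states = list(combinations_with_replacement(LEVELS, 2))  # 6 microstates

# Fermi-Dirac: indistinguishable AND Pauli exclusion (no double
# occupancy) -> unordered pairs without repetition.
fd_states = list(combinations(LEVELS, 2))                   # 3 microstates

def prob_both_in_level_0(states):
    """Probability that both particles sit in level 0, assuming each
    allowed microstate is equally likely."""
    return states.count((0, 0)) / len(states)

print(len(mb_states), len(be_states), len(fd_states))  # 9 6 3
print(prob_both_in_level_0(mb_states))                 # 1/9
print(prob_both_in_level_0(be_states))                 # 1/6
print(prob_both_in_level_0(fd_states))                 # 0.0 (Pauli exclusion)
```

The same physical situation yields three different sample spaces and three different probability distributions, depending solely on whether the particles are treated as denotable individuals or only as occupation facts.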
Which kind of relation is there between language and reality? We can suppose that the coding/decoding relationship is ''unambiguous'' in both directions (i.e. an injective function). There can be a ''surjectivity'' problem between the two domains. That is: is every semantic entity codified? And is every formal entity decodable? In the first rough 'linguistic' version of the ''syntactic paradigm'' the facts of reality were a subset of the sets of the 'linguistically expressible' and 'mentally conceivable' facts, and this was, as claimed in the first report, one of the roots of the ''negative judgment paradox''. Since Gorgias, the 'foundational' goal has been to rule out this paradox and to attempt to define a one-to-one relation. To this aim, among the entities syntactically definable in the two systems, some can be distinguished: in general we will call them the ''valid'' entities, or more specifically the ''true'' ones (in the ''object-system'') and the ''provable'' ones (in the ''meta-system''). Normally, the former are somehow given, while the latter have to be produced by the syntactical rules of the formal system. Then, the coding relation from the ''object-system'' to the ''meta-system'' can be such that any ''provable'' (or ''definable'') entity is decoded as a ''true'' (or ''existing'') entity (''correctness'' or soundness of the coding), or such that any ''true'' entity is coded as a ''provable'' entity (''completeness'' of the coding).
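In the standard logical notation (our rendering), writing $\vdash \varphi$ for provability in the meta-system and $\models \varphi$ for truth in the object-system, the two directions of the coding read:

```latex
% Soundness: everything provable decodes to something true.
\text{soundness:}\quad \vdash \varphi \;\Longrightarrow\; \models \varphi
% Completeness: everything true is coded by something provable.
\text{completeness:}\quad \models \varphi \;\Longrightarrow\; \vdash \varphi
```

The one-to-one ideal of the foundational program is the conjunction of the two: the ''valid'' entities of the two systems are exactly matched.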
A further subtle change in the paradigm is in its very nature. Knowledge is no longer a sort of, more or less developed, 'feature' of the single man, built up of single facts (Platonic birds or sentences), and the mind is no longer a sort of store (the Platonic bird-cage or block of wax). Knowledge gains the aspect of a 'theory' or a 'set of theories', i.e. a sort of system to be interrogated with questions, whose answers give the set of valid items. This change is linked to the increasing role of the subject, from the third-person ontology of Greek philosophy to modern philosophy. We have to stress the role of this 'asking' or questioning function (vertical lines in fig.2); its working in modern axiomatic theories is revealed by the distinction between a 'static' set of facts (axioms) and a 'dynamical' principle of reasoning (rules of inference). And the role of the 'subject' will be redefined in this 'asking' function, instead of in the 'mental world'. Thus, in the modern versions of the paradigm we will have to point out this new setting, and it will play a major role in our interpretation of the antinomical arguments. The characterization of the asking process in the syntactic paradigm is linked to the commutativity of the diagram in fig.2. That is, given a set of elements in the meta-system, the decoding of the element we prove from them can be obtained as a true consequence of the set of elements in the object-system which code the original set.
This is the general scheme. Roughly speaking, we could depict the modern scientific paradigm as a chain of worlds beginning with the world of reality, passing through some intermediate links such as physics, differential calculus and arithmetic, and ending with a formal world such as logic, elementary number theory or some sort of language of thought or mentalese. This is what it means today to accomplish the aim of writing the book of nature in mathematical characters. However, more specific paradigms for specific fragments of this chain can be developed starting from the general scheme. The scheme for quantum mechanics will be drawn in the following (fig.5). The scheme for Hilbert's metamathematics is shown in fig.3: there is an informal mathematical theory (arithmetic, set theory, geometry, etc.), its formalization (Peano's arithmetic, Zermelo's set theory, Hilbert's geometry, etc.) and the meta-theory which analyses the formal theory with finitary methods (informal finitary arithmetic). Metamathematics, as we shall see in the following, somehow plays the residual role of the 'mental' world, but at the same time coincides with informal arithmetic, the core of the sign world.
In the first report, to analyze the beginning of the Platonic and Aristotelian foundation, we underlined the crucial role of the ''negative judgment paradox'': <given that an affirmative statement corresponds to a fact in the world, something that is, we have that a negative statement corresponds to something which is not; but a statement about what is not is about nothing, and hence is impossible>. And we stressed that ancient classic Greek philosophy did not give the paradox a 'strict solution', but merely a 'removal' within a complex new foundation, overwhelming the whole mythological and presocratic culture as well. In the paradox we distinguished two main aspects, and in the Aristotelian foundation two different 'solutions'.
The first aspect was the ''falsity problem'': what corresponds to a false statement, as in the sentence ''Theaetetus flies''? Corresponding to a true statement there is something which is; corresponding to a false statement there is something which is not, that is, nothing. The solution to this aspect was built into the syntactic idea of the sentence as comprising different parts, above all a 'name' and a 'verb', and into the functional role of truth as an operator acting on the parts of the sentence down to its last components (an 'agent' and an 'action'), where it yields ''true'' or ''false'' according respectively to the 'real' blending or non-blending of the Forms evoked by the sentence. This could be called the functional idea of truth, and it can be recognized in the extensional and recursive definition of truth in the modern Tarskian semantics of the predicate calculus.
The second aspect was the lie problem: considering that 'speaking' and 'thinking' had in presocratic philosophy the same 'subjectivity status' as 'seeing' and 'hearing', to say what is not was as impossible as to see what is not. The solution had to be found in the break between the physical and psychological levels which began with Aristotle and was later developed in the Christian ideas of soul and free will. In this role, which we could call assertive, truth becomes the act of asserting a sentence (the subjective act of 'saying'), relating it to the corresponding objective 'facts'. In modern Tarskian semantics it can be found in the correspondence between language and metalanguage: '' 'p' is true if and only if p''.
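Both roles can be read off the standard Tarskian definition (our rendering): the assertive role in the T-schema, the functional role in the recursive clauses that push truth down to the atomic components:

```latex
% T-schema (assertive role): the metalanguage asserts what the
% object sentence says.
\ulcorner p \urcorner \ \text{is true} \iff p
% Recursive clauses (functional role): truth computed on the parts.
\neg A \ \text{is true} \iff A \ \text{is not true}
A \wedge B \ \text{is true} \iff A \ \text{is true and } B \ \text{is true}
P(a) \ \text{is true} \iff \text{the object denoted by } a \ \text{falls under } P
```

The last clause, for atomic sentences, is the modern counterpart of the final check of the 'real' blending or non-blending of the Forms.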
We underline that in both cases a key role is played by the mental world. In the first aspect, it is the place of the blending or non-blending of the Forms. In the second aspect, it is the psychological level and the dwelling of the soul. Hence, if the structure of the paradox is strongly connected with the unstructured semantic field of eimi and negatives, and with the lack of distinction between perception and thinking, its removal is linked to the syntactic paradigm: it appeared from the coincidence between language and being, disappeared with the creation of the new mental world, at the same time cradle of the Self and place of the ideas.
We can get a deeper understanding of the paradox by observing that in the ''falsity problem'' we deal with the expression of false sentences, i.e. we can assert non-existing facts and deny existing ones. In our terminology we must cope with an incorrectness of the coding for false sentences, and, when dealing with negations which cannot be positively expressed (a classic example of a positively expressed negation is ''not even = odd''), also with the lack of a 'mental image' corresponding to the sentence. The corresponding idea of truth in the Aristotelian foundation is functional, i.e. linked to the formal manipulation of non-empirical terms, such as negation, down to checking the final blending for atomic sentences, for here it is necessary to stress a structure-preserving correspondence between the worlds. In the ''lie problem'' we deal with the role of the subject in the strict inclusion of the real world in the language world, for we can assert non-existing facts. In our terminology, we have a subjectively uttered incorrectness of the coding for negative sentences. In this case there can be a 'subjective mental image' of the fact, but it lacks the correspondence with the 'real world'. In the Aristotelian foundation the corresponding idea of truth is assertive, i.e. based on the subjective assertion of something real, stressing the surjectivity of the relationship.
The Aristotelian foundation implied also a complex rebuilding of the mathematical sciences, beyond the old Babylonian and Pythagorean framework and the Zenonian paradoxes. Its greatest achievements were the syntactic paradigm, analyzed in the first report, and the syntactic proof theory and the new concepts of the infinite, analyzed in the second report. The formal reconstruction of some crucial fragments of natural language was central in that foundation:
- The splitting of the semantic field of being began. The usage of the copula for both predication and identity began to receive a specific operational treatment, the former in Aristotelian logic, the latter in Diophantine algebra.
- The existential usage of being stood out from the other aspects of the same semantic field. It was employed as a specific aspect in the postulates of axiomatic mathematics, translating the earlier, more empirical idea of construction.
- The being as truth, which expressed the part of truth linked only with the judgment and not with the substance, and which would play an ever increasing role in science, became simply removable and coincident with plain assertion in the axiomatic method. More precisely, the assertion became the intersubjective counterpart of the objective being as truth. This kind of predication shifted the assertions from the common sense terms to their scientific and empirical counterparts.
- The negative appeared fully commutable with the being as truth. With respect to the existential being, a complex operational parallel linked, in the Aristotelian truth as correspondence, truth, negation, existence and assertion: <true = asserting what exists or denying what does not exist; false = asserting what does not exist or denying what exists>.
In modern mathematical logic the negative becomes fully commutable with the copula as well, and the non-commutativity of negatives remains only in the other two Aristotelian examples: knowledge verbs (epistemic logic) and necessary/possible constructs (modal logic).
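The residual non-commutativity can be displayed in the standard notation (our rendering, with $K$ for 'knows that', $\Box$ for 'necessarily' and $\Diamond$ for 'possibly'):

```latex
% Copula: negation commutes (classical predicate logic).
\neg\, (a \ \text{is} \ F) \;\equiv\; a \ \text{is not-}F
% Knowledge verbs: negation does not commute.
\neg K p \;\not\equiv\; K \neg p
\quad\text{(not knowing that } p \text{ is weaker than knowing that not-}p\text{)}
% Modalities: negation does not commute.
\neg \Box p \;\not\equiv\; \Box \neg p
\quad\text{(rather } \neg \Box p \equiv \Diamond \neg p \text{)}
```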
As pointed out by Wittgenstein ([WITTGENSTEIN, 1974 #165], I.14), the property that ''when it is doubled it yields an affirmation'' is not a ''further description'', but ''constitutes negation''. The 'formal constitution' of negation is indeed embedded above all in the principles of ''non-contradiction'' and of the ''excluded middle''. The latter is strongly connected with the elimination of double negation, as underlined in the intuitionistic approach. We have to recall what was underlined in the previous reports: there are general concepts (being, truth, negation, etc.) which have no empirical content, that is, which have no extensional semantics in themselves and cannot in any way be derived from experience, but are nevertheless the necessary (and paradoxical) framework of the syntactic reconstruction of the world. Indeed, they have no counterpart in the Freudian ''Unconscious'' either, as revealed in dreams. Truth and being are embedded in the ''reality principle'' [FREUD, 1920 #177], while, for the absence of ''negation'' in the Unconscious, see Freud [FREUD, 1925 #35]. These points will be the core of the rational reconstruction of the paradoxical architecture of modern science that we shall develop at the end of this report.
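The connection between the two principles can be stated precisely (a standard fact of intuitionistic logic, our formulation): over the intuitionistic propositional calculus, the schemas of double negation elimination and of the excluded middle are interderivable:

```latex
% Double negation elimination and excluded middle: each, added to
% the intuitionistic calculus, yields full classical logic.
\text{DNE:}\quad \neg\neg p \rightarrow p
\qquad\qquad
\text{LEM:}\quad p \vee \neg p
```

Intuitionistically, only the weaker $p \rightarrow \neg\neg p$ and $\neg\neg(p \vee \neg p)$ are provable, which is why the intuitionistic approach can accept non-contradiction while rejecting the full 'formal constitution' of classical negation.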