Stoic Physics, Gödel, and Quantum Mechanics

In this article I briefly present the case that Stoic natural philosophy provides the missing meta-language needed to finally make sense of quantum mechanics. Modern Stoic writers such as Lawrence C. Becker dismiss Stoic physics as an embarrassment. Becker tries to cobble together a hybrid of Stoic ethics with what is essentially a modern version of Epicurean physics. The result is an Epicurean Stoicism, an oxymoron if ever there was one. To Chrysippus, the very core of ethics arises from the fundamental physical principles of matter. If Stoic physics is, as Becker describes it, merely a “flippant speculation about physical processes,” it is hard to see how Stoic ethics could possibly escape the same epitaph.

In Hellenistic times, the two dominant philosophical schools of thought were the Epicureans and the Stoics. Of course, there were also the Sceptics, who sat on the fence advocating suspension of judgment, as Sceptics do. Leaving aside the fence sitters, my interest is in the two opposing camps, with the Epicureans on one side of the fence and the Stoics on the other. Of the two camps, the easier to understand is the Epicurean philosophy. The reason is that, in so many ways, the Epicurean worldview corresponds pretty much to that of present-day science. Epicurean doctrine differs from modern science in that, like practically all natural philosophy in antiquity, it was non-empirical. Add empirical methodology and the associated quantification, and one ends up grosso modo with scientific methodology resembling modern physics. Charles Sanders Peirce picked up on this when he wrote that the philosophy of John Stuart Mill corresponded almost exactly with that of the Epicureans.

Epicureanism is materialist, determinist, and above all, fundamentally atomist. Epicureanism studies the reality “out there,” a reality that is assumed mind-independent and behaving in a completely deterministic way – well, almost completely deterministic. Unbridled determinism leaves no place for free will, and that poses a problem. To leave some slack for free will, Epicurus added a fresh ingredient, an escape clause in his atomist, deterministic equation. Certainly, reality could be, in the limit, totally explained by the deterministic motion of the atoms making up the material universe. However, this motion was not totally deterministic. Apparently, every now and then an atom exhibits an imperceptible random “swerve.” If this were not the case, the universe would never have evolved beyond its point of departure. Instead, as a gross accumulation of random Epicurean Swerves, the universe nano-swerved into the state that we see it in today. And there you have it. With a bit of creative elaboration, this worldview could even embrace Darwin’s theory of evolution. After matter nano-swerves to a certain state of affairs, matter starts micro-mutating in such a way as to produce organic compounds, elementary life forms, amoebas, monkeys, and eventually us. We are all the end result of trillions of Epicurean Swerves.

Classical nineteenth-century physics had no need for the Epicurean Swerve, as it espoused a completely deterministic form of atomism, much like that of Leucippus and Democritus, who preceded Epicurus. But modern physics is not classical; it is quantum. Unlike classical physics, quantum mechanics has been developed in order to explain its own version of the non-deterministic Epicurean Swerve. According to quantum mechanics, reality “out there” is not deterministic but permeated with its own versions of the Epicurean Swerve. The observed non-deterministic behaviours of nature at the quantum level are sometimes referred to as the “quantum mysteries” or even as examples of quantum “weirdness.” Specifically, this includes the questions of entanglement, Heisenberg’s uncertainty principle, the collapse of the wave function, the mysteries of the two-slit experiment, and Einstein’s comment regarding “spooky action at a distance.”

From Dirac Razor to Stoic Razor

With the advent of quantum mechanics, the chief casualty of classical physics is the concept of a mind-independent reality. The isolation of the objective world of objects from the subjective world of the subject is unachievable in practice. Somehow, the lots of object and subject are intimately entwined and interdependent. The most frank admission of the new reality comes from the Copenhagen interpretation of quantum mechanics, as expressed by Dirac in what is often referred to as the Dirac Razor. The Razor states that the new physics is basically a formal scheme limited to the prediction of experimental results. Anything to say about ontological or other philosophical questions is strictly outside the realm of physics.

In other words, one should avoid sounding silly by claiming that something exists in “the world out there.” Instead, the only objective knowledge about what exists is that which exists concurrent with the instant of measurement. This can be thought of as the moment when object and subject are both present. Both share the same “now,” so to speak. It is in this idealised instant that we start to glimpse the need for a change of paradigm. Dirac’s Razor declares that the new physics requires a dramatic paradigm shift from classical physics. However, the new physics does not explain the new paradigm. The new physics simply cries out, “Shut up and calculate.” Instead of leading to a clearer and more insightful view of the nature of reality, the new physics provides the opposite – a mindless, numbing mania of number, measurement and obscure abstraction. The curious public demand more; they demand an explanation.

The missing paradigm can be found by breaking away from Epicurean-style realism and changing philosophical camp to that of the Stoics. The Stoics can be said to have their own version of Dirac’s Razor and the primacy of the moment. The Stoic version states:

Entities in the past or the future do not objectively exist. The only entities that exist are those immediately present with the subject.

The Stoics exploited this principle in their ethics, teaching not to fear anything in the past or the future as such things do not objectively exist and so cannot exercise any powers on the present. The Stoics thus become heroes of the present, mastering the integrity of now.

The same principle underpinned their physics. To the Stoics, entities had to be material and corporeal, capable of acting upon and being acted upon by other material, corporeal entities. Of course, all such acting and being acted upon only occurs in the present. Implicitly or explicitly, present in the present must be the subject. Thus, the nowness involved in the Stoic Razor is that of the subject in question. This means that the principle must apply to the universe we live in, the universe as subject bathing in its nowness. Moreover, the same principle must also apply to any other subject, such as living organisms and of course human beings, be they slaves or freemen, man, woman, or child. Whether animate or inanimate, all creatures become heroes of their own present, an individual presence that is distinct from but harmonious with that of Nature.

I interpret and express this underlying, universal principle of Stoicism as the Stoic Razor. The Razor is synonymous with the principle of life. The physics and logic of life must be based on this universal principle. It is important to note that present-day computer-controlled robotic systems violate this principle. The “present” or “nowness” owned by a robot is dominated by pre-programmed instructions. Instead of being a hero of the present and making its own way in the world, the robot is a slave to its past.

The Stoic Razor principle dictates which entities objectively exist. The principle applies to all organisms, be they animates such as biological life forms, or inanimates like the universe we live in. All such organisms are dictated by this draconian condition. But here is the catch. Here is the rub. The condition is so draconian that it must apply to itself. The principle demands that an organism obey the principle at all costs and then, in the same breath, demands that the same organism refuse to be dictated to by any principle whatsoever outside its immediate presence. What this means is that the principle is non-programmable.

Gödel versus the Stoics

We see here the flip side to what could be called the Gödel Razor. The formalisation of the pre-programmed robot, or Turing Machine, takes the form of an axiomatic mathematical system. To be non-trivial, the axioms must include those of elementary arithmetic. For any such non-trivial axiomatic system A, Gödel’s first Incompleteness Theorem applies. The incompleteness theorem comes up with its own version of a principle applied negatively to itself, expressed as the proposition G:

G: The proposition G cannot be proved.

Now if G can be proven from the axioms A, the mathematical system must be inconsistent, as G says that G cannot be proven. On the other hand, if we assume that the system is consistent, then there must exist propositions in A which are valid but cannot be proven. Thus, Gödel’s incompleteness theorem effectively states that any consistent axiomatic mathematical system A can be cut down the middle into two sides with what I will call the Gödel Razor. On one side of the razor, the left side say, will be all of the theorems provable from A. On the right side of the razor is the murky side of the system, made up of all the rest of the well-formed formulas of A, some of which will be logically valid but unprovable from A. Gödel’s incompleteness theorem says that these valid but unprovable propositions must always exist, no matter what. Many consider Gödel’s incompleteness theorem to be the most important mathematical result of the twentieth century.

Mathematics is interested in all of the propositions on the left side of the Gödel Razor. These are all the provable theorems of the axiomatic system A. In principle, it is possible to program a Turing Machine to mechanically enumerate all of the theorems on the left side of the Gödel Razor. However, on the right side there are also some propositions that are valid. These are valid propositions of the system but are unprovable. Mathematicians may be interested in these unprovable propositions, but Gödel has proven them to be out of bounds to the traditional paradigm of mathematics.
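The mechanical enumerability of the left side can be illustrated with a toy formal system. The sketch below is my own illustration, not part of the original argument: it uses Hofstadter’s well-known MIU string-rewriting system, whose single axiom “MI” and four rewrite rules play the role of the axioms A, and a breadth-first search churns out theorems one after another, exactly as a Turing Machine would.

```python
from collections import deque

# Hofstadter's MIU system: axiom "MI" plus four rewrite rules.
def successors(s):
    out = set()
    if s.endswith("I"):
        out.add(s + "U")                      # rule 1: xI  -> xIU
    if s.startswith("M"):
        out.add(s + s[1:])                    # rule 2: Mx  -> Mxx
    for i in range(len(s) - 2):
        if s[i:i + 3] == "III":
            out.add(s[:i] + "U" + s[i + 3:])  # rule 3: III -> U
    for i in range(len(s) - 1):
        if s[i:i + 2] == "UU":
            out.add(s[:i] + s[i + 2:])        # rule 4: UU  -> (deleted)
    return out

def enumerate_theorems(limit):
    """Mechanically enumerate the first `limit` theorems, breadth first."""
    seen, queue, theorems = {"MI"}, deque(["MI"]), []
    while queue and len(theorems) < limit:
        s = queue.popleft()
        theorems.append(s)
        for t in sorted(successors(s)):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return theorems

print(enumerate_theorems(5))   # ['MI', 'MII', 'MIU', ...]
```

Famously, the string “MU” is a well-formed formula of this system that sits on the right side of the razor: no amount of enumeration will ever produce it.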

This is the point where the ancient Stoics can step in. We, as Stoics, can take a fresh look at the proposition G above. We say that a proposition is true if it can be proven from the axioms A. The Stoics referred to this as the contingently true, in this case contingent on the axioms A. The Stoics made a clear distinction between the contingently true and the truth. The contingently true was considered incorporeal. That is fair enough. There is nothing more abstract and incorporeal than being contingent on a set of abstract axioms A. For the Stoics, just as the true was incorporeal, the truth was corporeal. This might seem quite odd. How can a truth be corporeal? What is a good example of a corporeal truth?

The best example I can come up with is none other than Gödel’s central proposition G, where we add the proviso that G must be a corporeal truth! Following the Stoics, to be a corporeal truth G must involve material, corporeal bodies acting on and being acted upon, all of this taking place in the present. Here we have left the world of abstract mathematics and have entered the world of an organism pushing and shoving, toing and froing, in such a way as to maintain the veracity and hence truth of a fundamental proposition, notably the proposition G. Somehow it seems that for this organism, assuring the truth of G is so important that its life depends upon it. The bodies immediately associated with or owned by the organism must act and be acted upon in such a way that the proposition G is valid. Imagine that this organism is fighting for life. The organism’s prime purpose in life is to assure that the proposition G corresponds to the truth. The organism will do anything within its physical powers for this to be the case. These are desperate times. Moreover, there seems to be no let-up. It seems that this preoccupation will endure throughout its life, right up to the day it dies. From the cradle to the grave, for this organism G is and must be maintained as a fundamental truth. This is a self-justifying truth and, hopefully for the organism, it is going to work.

Now it is time to read the fine print. What does this organism-backed proposition G actually say? G states that G cannot be proven. Now G might conceivably contain some more fine print. This is of no concern to us but might be of some concern to the organism fighting for its particular mode of life. Extra specificity in the proposition G is permissible as long as it is free from any entanglement with nefarious activity outside the organism’s precious nowness.

In Conclusion

I hope that I have written graphically enough to convey the central message. The epistemological foundation of present-day science, with its realist, mind-independent view of the world, is Epicurean in nature. Quantum mechanics, with its associated quantum mysteries and weirdness, throws a spanner in the works. Stoic natural philosophy, based on its logic, physics and even its ethics, provides a way out of the conundrum. I have sketched out how the Stoic paradigm is diametrically opposed to that of axiomatic mathematics. Everything provable in an axiomatic mathematical system can be enumerated by a Turing machine type computer. But Gödel showed that certain truths are out of bounds of formal mathematics. However, I claim that they are not out of bounds to another kind of formalism — that implicit in Stoic natural philosophy. Truths on the out-of-bounds side of Gödel’s Razor become accessible from within my interpretation of the Stoic paradigm.

The simple message I am trying to convey in this article is that the principle of Stoicism involves a universal life principle underlying the organisation of all animate life, as well as inanimate life such as our universe as an organism in its own right. The organisational principle is the opposite of that of formal mathematics. Deterministic systems like mathematics aspire to establish a chain of relations from the a priori to the a posteriori, from the axiom to the theorem, from cause to effect. This is the reason of the robot. The reason of life involves an organism hell-bent on proving its own self-reliance by NOT being dependent on the a priori. The robot is driven by the a priori; the life form is driven by what it now is, not by what it ever was.

However badly I may have explained it, just go back to the Gödel proposition G and see how Gödel handled it for mathematics. Then, take the opposite to that, and you have the Stoic paradigm in a nutshell…

Final Points


Robots are pre-programmed and life forms are not. But every biological life form is programmed by its genome, is it not? Are not life forms just robots pre-programmed in their DNA?


I explain elsewhere that the genetic code is not a programming language. Mathematical language and all computer-programming languages encode diachronic structures. The most elementary diachronic structure is the Peano successor function that both Russell and Whitehead and Gödel used to generate the natural numbers. I claim that the genetic code is a calculus of physics and logic that is without number and so free of any successor function. No diachronic structure is allowed. It only codes the present. You have to read my book and other work to get a fuller grasp of that, though. The genetic code is a non-diachronic coding technology, not a language in the usual sense.
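For readers unfamiliar with it, the successor function mentioned above can be rendered in a few lines. This toy sketch is mine and purely illustrative: it builds the naturals as nested containers, making the diachronic, step-after-step character of number plain.

```python
# Peano-style naturals: a zero plus a successor function S. Each number
# is just the chain of S-applications that produced it: pure diachrony.

ZERO = ()

def successor(n):
    """S(n): wrap n one level deeper."""
    return (n,)

def to_int(n):
    """Read an ordinary integer back off by counting the nesting depth."""
    depth = 0
    while n != ():
        n = n[0]
        depth += 1
    return depth

three = successor(successor(successor(ZERO)))
print(to_int(three))   # 3
```

Every number here is defined entirely by what came before it, which is exactly the a-priori-to-a-posteriori chain the article attributes to the left side paradigm.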


Biological life forms are coded in the genetic code. How can the universe be seen as a life form when there is no sign of a genetic code?


I rename the genetic code the generic code. Biological life forms I classify as animates. In animates, the generic code is expressed as genetic material (DNA or RNA) separate from the functioning material. Organisms like our universe I call inanimates. In inanimates, there is no distinction between genetic material and functional material. It is all the one stuff. The four letters of the generic code, combined in triads, correspond to the elementary particles in physics. On my website, I have constructed an interactive database that shows this correspondence. Paper 4 is a draft of how that all works (more is in the pipeline). In other words, the Standard Model of particle physics, and much more, can be worked out from first principles by a reinvigorated Stoic natural philosophy acting as a kind of metaphysics.


Stoic physics is based on the ancient four-element theory of Empedocles. This theory has long been debunked and replaced by modern physics.


Quantum physics has come to the same kind of conclusion as Empedocles and, in particular, Heraclitus. There are four kinds of tension, four kinds of fundamental force, in particle physics and quantum mechanics, as is well accepted. Foursomes occur regularly throughout physics and even in mathematics. In Category Theory, there are four kinds of morphism: epi, mono, bi and iso. Aristotle’s syllogistic logic was the first to provide a logical basis for a four-element aspect of logic. One can explore this in the four terms of syllogistic logic in my Aristotle Engine on my website. However, the Stoic five indemonstrables provide a direct statement of the logic behind the four-element theory of matter. The third indemonstrable actually corresponds to the fifth Stoic element, pneuma, and can be used to construct the other four classic elements.

The Scholastics used the letters A, E, I and O to label the four terms of the syllogism. Nature uses the letters A, T, G and C in the genetic cum generic code to code the basic building blocks of life. In my writings I show how this relates to Stoic logic and the ancient four-element theory of matter, as well as the modern four-force theory of physics.

Weirdness In Physics

There are two kinds of physics: one classical, one weird. Classical physics is based on the pleasing, common-sense notion that there is a realist, mind-independent reality “out there,” just waiting to be measured and described by precise, deterministic, mathematical models. For terminological convenience, I will refer to this kind of paradigm as left side. In this paper, I address the complementary opposite, the right side paradigm. The right side leads to physics that is far removed from common sense. This is the natural domain of quantum theory, a domain full of apparent weirdness such as the quantum mysteries of uncertainty, the wave-particle duality revealed by the double-slit experiment, entanglement, and so on. The rational foundations of the classical left side paradigm have reached a relative maturity and demonstrate a high degree of rigour. The same cannot be said for the right side paradigm. Quantum theory, with its apparent counter-intuitive weirdness and many mysteries, is in dire need of formalisation. The objective of this paper is to provide such a foundation and even a language-based calculus. The calculus provides an alternative to the Standard Model and gives a much more detailed account of the elementary logical structure of matter, independent of the need for any empirical observations.

Like Odin, the ancient Norse god of thought and logic, present-day physics achieves rational clarity by a simple trick: that of being one-eyed. According to present-day orthodoxy, there is only one scientific paradigm. Science is mono-lateral, not bilateral. The paradigm must be fundamentally left side and thus realist. The apparent weirdness thrown up by quantum theory demands a quest for further refinement of existing orthodoxy, not a radically different paradigm switch. There is no right side science. This is not the position taken in this paper. One can still remain true to the one-eyed Western tradition illustrated by the monocular Odin, god of reason. All one has to do is use the other eye – but not at the same time, of course.

I argue that science must be bilateral. There must be two separate, fundamentally opposed but complementary paradigms.

The idea is far from novel. Bohm argued his own form of bilateralism with his notions of the Explicate and Implicate orders. Dirac saw the two paradigms of physics as one fundamentally temporal and the other as fundamentally spatial. The philosophers see it as an opposition between physics and metaphysics.

The general characteristics of the left side paradigm correspond to what is called the Scientific Method. As well as being fundamentally realist, the methodology is reductionist, atomist, and dualist. If there is going to be a symmetry between the two paradigms, one would expect the right side paradigm to be non-realist, non-atomist, and non-dualist, whatever these terms might eventually mean once formalised.

Underlying these dichotomies between realism and non-realism, dualism and monism and so on, there must be a fundamental dichotomy from which all others arise.

An intuitive, informal idea of the fundamental dichotomy is that between object and subject. Both the left side and right side paradigms embrace this same dichotomy right at the very core of their respective formalisms. However, they treat the subject-object dichotomy in quite opposite ways, and these two treatments establish a further dichotomy between the left and right paradigms. The difference is as follows. The left side paradigm articulates the epistemological configuration of the traditional classical sciences.

The left side paradigm starts with the subject-object dualism taken at the macro level. This demands a pure realism where the object of study is objectified by eliminating all reference to the subject, which remains forever invisibly off stage. In empirical science, the object of the science is embraced in an environment called the controlled experiment, a subject-free laboratory. The subject, in this context, becomes impersonal and has sometimes been referred to as the God’s eye view, or even the view from nowhere; empirical scientists would probably prefer the interpretation of the view of the objective, dispassionate observer. Traditional mathematics objectifies each of its problem domains by a controlled rational environment defined by a set of axioms. The subject is nowhere to be seen in axiomatic mathematics, as all mathematical entities are objects. As in the empirical sciences, the implicit macro-level impersonal subject is invisible in the formalism. Given an axiomatic system A, the only macro dualism in mathematics is the notion of the mathematically dual system A′. However, the elusive subject is nowhere to be seen in A′ as, just like in system A, the mathematical entities of A′ are all objects. Axiomatic mathematics is a two-headed coin.

The strength of Western culture is to be, like Odin, one-eyed. But which eye? To be right-eyed is to be left-brained. That is the way of present-day science. The secret to an integrated scientific view of reality may be to be one-eyed but to change from one eye to the other, depending on circumstance or lack thereof, only ever being one-eyed at any one time. Present-day science is fixed right-eyed, hence left-brained. The alternative perspective is ignored. What is needed is a bilateral approach to science. Present-day science is mono-lateral. Odin was much wiser than that, surely.

Logic Driven Science: Physics without Attributes


Physics, as it is presently construed, involves the study of physical phenomena. This kind of science I will call phenomenal physics. Of central concern is the motion of physical bodies. Classical Newtonian physics proposed the first version of the laws of motion of such bodies. Einstein provided a second version that took relativity into account. At the macro level, the laws of motion based on Special and General Relativity are so accurate that, for all intents and purposes, they are generally considered exact. However, at the quantum level of physical reality, the deterministic laws of macro physics break down. The breakdown is dramatic. David Bohm remarks that at this level:

…there are no laws at all that apply in detail to the actual movements of individual particles and that only statistical predictions can be made about large aggregates of such particles. (Bohm, 1980)

The laws of motion for individual particles simply vanish at the quantum level. Quantum Mechanics takes up the challenge and provides the wave function as the necessary probabilistic way of predicting the phenomenon of individual particle behaviour.

At the level of elementary particles, phenomenal physics has virtually nothing to say about the state of affairs of any individual particle, except at the extreme instant of measurement. The only two possible exceptions are at opposite ends of the phenomenal spectrum and are constants. These are rest mass and the speed of light, which both appear to be stable measurables and are useful for scaling the system. Any non-constant property of an individual particle is effectively quantifiably meaningless. I will henceforth refer to these non-constant properties as attributes.

In this paper, I accept the scientific uselessness of the attributes of an individual particle. I then proceed to argue that it is useless to carry such burdensome luggage along in the formalism needed to understand elementary particles. Attributes only add unnecessary clutter to the science. After taking this dramatic step, we are naturally led to another kind of physics – physics without attributes. Physics without attributes is obviously a different breed of fish to traditional phenomenal physics. For want of a better name, I will call this science generic physics.

Generic Physics


The relationship between phenomenal physics and generic physics is somewhat like that imagined by Bohm with his Explicate Order and Implicate Order. The Explicate Order corresponds to traditional phenomenal physics, which he saw as derivative of a higher, ultra-holistic, unifying Implicate Order. Bohm’s approach has many similarities with the one I have been developing in previous work. Like me, he even refers to the left and right brain analogy. In order to lighten the terminology, I will sometimes refer to the phenomenal, “Explicate Order” side as the “left side” paradigm or point of view, and the generic, “Implicate Order” side as the “right side” paradigm. In this paper I will provide the necessary constructs to formalise the difference between the two paradigms and their formal nature, something that is missing from Bohm’s account. As will be seen, my account of the right side paradigm is presented quite differently to Bohm’s Implicate Order.

For me, order is the affair of the left side paradigm, a paradigm shared by all the traditional sciences, including axiomatic mathematics. From an epistemological perspective, the left side “Explicate Order” sees reality as diachronic. The diachrony in mathematics is expressed at the elemental level as number. The diachrony of number is most forcibly expressed in Peano arithmetic, in the form of five axioms essentially defining the successor function, the fundamental mathematical engine of diachrony. This was recognised by Russell and Whitehead in building their Principia Mathematica system, and equally by Gödel, who brought it tumbling down. Intuitively, the diachronic nature of the left side paradigm can be thought of as a worldview relating the a priori with the a posteriori. The diachronic structure applies no matter what the science, or whether it is mathematics or logic.
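Gödel’s demolition relied on making the diachrony of number carry the system’s own syntax. The toy sketch below is my own illustration, with an invented symbol table: it encodes a formula as a single natural number using prime powers, in miniature imitation of Gödel’s original numbering scheme.

```python
# A toy Godel numbering: encode a formula (a string of symbols) as one
# natural number, and decode it back. The symbol codes are invented for
# this example; the prime-power scheme is Godel's idea in miniature.

SYMBOLS = {"0": 1, "s": 2, "=": 3, "+": 4, "(": 5, ")": 6}

def primes():
    """Yield primes 2, 3, 5, ... by trial division (fine at toy scale)."""
    n, found = 2, []
    while True:
        if all(n % p for p in found):
            found.append(n)
            yield n
        n += 1

def godel_number(formula):
    """Encode symbol codes c1, c2, ... as 2**c1 * 3**c2 * 5**c3 * ..."""
    g = 1
    for prime, sym in zip(primes(), formula):
        g *= prime ** SYMBOLS[sym]
    return g

def decode(g):
    """Recover the formula by reading off the exponent of each prime."""
    inv = {v: k for k, v in SYMBOLS.items()}
    out = []
    for p in primes():
        if g == 1:
            break
        e = 0
        while g % p == 0:
            g //= p
            e += 1
        out.append(inv[e])
    return "".join(out)

print(decode(godel_number("s(0)=s(0)")))   # prints s(0)=s(0)
```

Once formulas are numbers, statements about provability become statements about numbers, which is what lets a system of arithmetic talk about itself.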

Turning back to the much less familiar right side paradigm, Bohm sees this as a higher-order form of organisation, his Implicate Order. He still sees this holistic, unifying paradigm as an order, whatever that may mean. Moreover, he also still sees it as phenomenal physics, albeit operating at a higher organisational level. The fragmented, localised perspective of the left side paradigm gives way to a flickering, hologram-like image of reality. Standing waves of interfering quantum fields determine what we see as particles, explains Bohm. The imagery has some merit but lacks any rigorous formalising methodology.

My approach to the “Implicate Order” is not to see order at all, but its complete abolition. Diachrony gives way to a pure synchrony. The perspective is that of the ancient Stoics, who claimed that the only things that exist are those that exist synchronously with the subject. Objects in the past do not exist; neither do objects in the future. The only objects that exist are those in the immediacy of now, relative to the subject. To the materialist Stoics, the objects in existence must be material bodies capable of acting or being acted upon. From a Stoic perspective, Bohm’s Implicate Order takes place in the immediacy of a subject’s nowness.

How to get rid of attributes

Generic physics is physics without attributes. Getting rid of attributes is one thing, but what can we replace attributes with? The answer to this little puzzle is surprisingly simple as well as surprisingly profound. We start by considering an entity which has a single attribute and examining the entity-attribute relationship.

First, take the diachronic traditional viewpoint of all the traditional sciences and axiomatic mathematics. According to the conventional wisdom of the left side doctrine, there is a distinct dichotomy between entities and attributes. No entity is ever an attribute nor any attribute ever an entity.  Then comes the problem of gleaning knowledge about the entity.  Conventional wisdom clearly would say that one cannot get to know the entity directly but only via its attribute. Thus any science pertaining to such entities must be attribute driven.  In other words, common sense declares that science, and hence physics, must be empirical in nature. This is the standard orthodoxy proclaimed by all left side science. There are no surprises there.

Now turn to the not-so-orthodox right side perspective. This is the perspective that does away with the need for attributes. In the left side scenario, the scene was occupied by an attribute, with the corresponding entity hidden off-stage. Knowledge of the entity is gained by getting to know the antics of the on-stage attribute. In the right side scenario, both the entity X and the attribute Y are on centre stage. The attribute is considered an entity in its own right. Any specificity it may or may not convey is of no importance. What matters is the dialectical relationship between these two players. This relationship is semantic. The entity X will express its only known specificity, the fact that X has an attribute. The entity Y will express its only known specificity, the fact that it is an attribute. To use expressions familiar in Computer Science, X expresses HAS-A semantics, whilst Y expresses IS-A semantics. The basic idea in this right side science is that one no longer cares about the value of attributes. What matters is whether an entity is an attribute or has an attribute.

This IS-A, HAS-A construct leads to a generic way of typing entities. I call this construct ontological gender. An entity with HAS-A typing will be said to be of feminine gender, and an entity with IS-A typing will be said to be of masculine gender. Of fundamental importance is to realise that gender is not an attribute. Two entities with different attributes can be distinguished from each other by attribute comparison. Two entities of different gender cannot be distinguished from each other by attribute comparison, for the simple reason that there is only one attribute between them. One has it, the other is it. In what follows, I will show how this gender construct maps onto the ancient use of this construct in Stoic physics and Stoic logic.


One use of the IS-A and HAS-A construct in computer science is in the design of Object Oriented programming languages. The early OO language C++ allowed unrestricted multiple inheritance of entities with IS-A and HAS-A semantics. This was found to lead to bad programming practice. The next generation of OO languages, such as Java and C#, were designed to allow only the single inheritance of IS-A semantics. Inheritance should be limited to the masculine line. For example, a Cadillac IS-A Car. Also, a Cadillac HAS-A CD player, HAS-A engine, and so on. Whilst it is perfectly reasonable for the class of Cadillacs to inherit the common interface of the class of Cars, it doesn't make much sense for a Cadillac to inherit the interface of CD players or engines.
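The Cadillac example can be rendered as a minimal Python sketch (the class and method names here are hypothetical, chosen only to mirror the text): single inheritance carries the IS-A line, while composition carries the HAS-A components.

```python
class Car:                          # the general interface of all Cars
    def drive(self):
        return "driving"

class Engine:                       # a component, not an ancestor
    def start(self):
        return "vroom"

class CDPlayer:                     # another component
    def play(self):
        return "music"

class Cadillac(Car):                # IS-A: single inheritance along the masculine line
    def __init__(self):
        self.engine = Engine()      # HAS-A: composition, not inheritance
        self.cd_player = CDPlayer() # HAS-A

caddy = Cadillac()
print(isinstance(caddy, Car))       # IS-A relationship holds
print(caddy.engine.start())         # HAS-A components reached by delegation
```

The Cadillac inherits the Car interface but merely contains its engine and CD player, which is exactly the single-inheritance discipline described above.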

Single inheritance along the masculine IS-A line is good programming practice in OO! It's a bit like traditional societies where the family unit inherits the surname and clan membership (inherits the social interface) from the male IS-A line whilst the feminine turns up with the dowry of ten cows (HAS-A). I find that fascinating but will not dwell on it.

In quantum mechanics, the famous Bell test experiments demonstrated that, at the micro level, there are no local hidden variables, no intrinsic attributes. Attributes are only accidental and have no place in universal science. What matters is qualification in terms of the universal IS-A and HAS-A semantics. Quantum mechanics based on IS-A and HAS-A quantum states is the way to go. I will be developing this theme in later posts and in a paper I am writing.

A computer illustration of gender is the placeholder-value dichotomy. Consider a standard 32 bit computer. The computer has 4 gigabytes of addressable memory, and each 32 bit memory location can store a value ranging from zero to just under 4 "gig." From an attribute perspective, this computer is a cruncher of 32 bit numbers and it is hard to understand how it works. However, ignoring the specificity of the numbers, one can look at a computer as being organised along gender lines. A placeholder for a value can be thought of as feminine, and the value contained as masculine. Consider now the contents of a general purpose register in the computer. What is the gender of the number contained in the register? From the register's point of view, the number is a contained value, and hence masculine. However, this number could also be interpreted as a pointer to a memory placeholder, and hence be interpreted as feminine. Is it a pointer or a value? Is it feminine or masculine? In actual fact, without knowing the complete context, there is no way of telling the difference. The gender status of the general purpose register could be said to "be in superposition." Nevertheless, despite the fleeting nature of gender when viewed by a third party, we do now know that a computer is a system involving the dynamic organisation of value and placeholder semantics. That said, the gender structure in computers is extremely shallow, making this a somewhat desperate example offered to get the reader beginning to grapple seriously with the gender concept. It is more an allegory than an example.
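The register ambiguity can be mimicked with a toy Python model (the memory size and the particular numbers are invented purely for illustration): the same bare number reads as masculine when taken as a value and feminine when taken as a pointer.

```python
# Toy model: memory as a list of cells; a register holds a bare number.
memory = [0] * 16
memory[7] = 42          # the placeholder at address 7 holds the value 42

register = 7            # is this the value 7, or a pointer to address 7?

# Interpreted as a masculine contained value:
as_value = register                 # 7
# Interpreted as a feminine placeholder reference (pointer dereference):
as_pointer = memory[register]       # 42

print(as_value, as_pointer)
```

Nothing in the register itself decides between the two readings; only the surrounding context can, which is the "superposition" the text alludes to.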

In summary, the gender construct provides an alternative to attribute based semantics. Gender semantics provide a qualitative alternative to the traditional quantitative approach. Of course, entities typed with a single masculine or feminine gender are too ephemeral to be considered discernible entities. However, the situation changes in the case of entities with mixed gender. Rather than considering gendered monads as the building blocks of the science, consider dyads where each end of the dyad is simply gender typed as masculine or feminine. This leads to four possible binary gendered dyads: MF, FM, FF, and MM.
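The four binary gendered dyads can be enumerated mechanically, a one-liner in Python over the two monadic genders:

```python
from itertools import product

# Enumerate all dyads whose two ends are typed masculine (M) or feminine (F).
genders = ("M", "F")
dyads = ["".join(pair) for pair in product(genders, repeat=2)]
print(dyads)   # ['MM', 'MF', 'FM', 'FF']
```

The point of the sketch is simply that two gender types at two positions generate exactly the four dyads named in the text, with no attributes involved anywhere.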

Because gender is an attribute free construct, it is not restricted to the attribute specificity of any particular problem domain. It is a truly universal construct and can literally apply to any problem domain whatsoever. Of particular interest in this paper is to associate gender with logic. My overall strategy is to exploit this universal gender logic as the logical foundation for physics. The proof of the pudding will be to show how this foundational logic naturally leads to a generative scheme that enumerates the elementary particles of a logical physical reality. The approach is generic and independent of any specific attribute system. The predicted elementary particles would apply to any phenomenal reality as long as it is "logical."

What are the logical properties of gender? In this quest one is immediately led to Aristotle's Term Logic, the Syllogistic. The formal structure of the syllogism is quite simple. Each syllogism is made up of three propositions: a Major premise, a Minor premise, and a Conclusion. There are four elemental propositional forms. It is not difficult to discern the implicit gender typing in this syllogistic system. Each proposition is binary typed. Aristotle doesn't use a masculine-feminine dichotomy but a Distributed-Undistributed dichotomy: a subject or predicate is either Distributed or Undistributed. Thus the four possible forms are typed as DU, UD, UU, and DD. Textbooks make valiant attempts to explain when a subject or predicate is distributed or undistributed. The best way is to simply see the distributed subject or predicate as expressing IS-A semantics and the undistributed as expressing HAS-A semantics. In other words, the distributed corresponds to masculine typing and the undistributed to feminine.
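Taking the standard textbook distribution patterns for the four categorical forms A, E, I, O, the proposed reading of Distributed as masculine and Undistributed as feminine can be tabulated in a few lines of Python:

```python
# (subject, predicate) distribution of the four categorical forms:
# A: all S are P (S distributed, P not); E: no S are P (both);
# I: some S are P (neither);             O: some S are not P (P only).
distribution = {"A": "DU", "E": "DD", "I": "UU", "O": "UD"}

# Read D (distributed) as masculine IS-A, U (undistributed) as feminine HAS-A.
to_gender = str.maketrans({"D": "M", "U": "F"})

for form, pattern in distribution.items():
    print(form, pattern, pattern.translate(to_gender))
```

Under this translation the four term types DU, UD, UU, DD line up exactly with the four binary gendered dyads MF, FM, FF, MM introduced earlier.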

For a rapid refresh of syllogistic logic in this context, I recommend that the reader spend a few minutes with my online syllogistic machine.

However, the logical platform that we need to generate the elementary constituents will not be Aristotle’s Syllogistic logic but rather the closely related Stoic logical system that came later.  

[1] In previous work I explained how a weak version of the left and right side paradigms can be found in Heaviside's Operational Calculus. On the left "time domain" side can be found time series and a complicated calculus of differential equations. On the right "frequency domain" side can be found a simple algebra of functions of a complex variable calculable by Laplace transforms. Note that the Laplace transform F(s) of a continuous function f(t) has the "holographic" mathematical property that, given a finite sample of F(s), no matter how small, the rest of F(s) can be perfectly reconstructed.

Bilateral Science

This post is working towards a paper I will call Logic Driven Physics. At the moment, I believe that I am the only person in the world writing this story of how the science of the Stoics can be reverse engineered to provide a new, alternative take on physics, logic, and mathematics.

In this post, I consider physical reality as a system. I take a leaf out of system science where there is not one paradigm for understanding a system but two. I argue that the foundations of science, including physics and mathematics, must be bilateral. System science demands two takes on reality. One take is diachronic in nature, the other synchronic. In system science, the diachronic side employs ordinary calculus and studies time series, whilst the synchronic side employs the operational calculus pioneered by Heaviside and sees its reality as having a "holographic" flavour, embodied nowadays in Laplace and Fourier transforms.
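The diachronic-synchronic bridge via the Laplace transform can be made concrete with a small numerical sketch: approximating the transform of the time-domain function f(t) = e^(-t) and checking it against the known closed form 1/(s+1). The midpoint integration scheme, step count, and cutoff are arbitrary choices for illustration only.

```python
import math

def laplace(f, s, upper=50.0, n=200_000):
    # Midpoint-rule approximation of the Laplace integral from 0 to infinity
    # of f(t) * exp(-s*t) dt, truncated at t = upper.
    h = upper / n
    return sum(f((k + 0.5) * h) * math.exp(-s * (k + 0.5) * h)
               for k in range(n)) * h

f = lambda t: math.exp(-t)     # diachronic time-domain function f(t) = e^(-t)
s = 2.0
approx = laplace(f, s)
print(approx)                  # close to 1 / (s + 1) = 1/3
```

The point is the deterministic character of the bridge: the synchronic picture F(s) is computed exactly (here, numerically to high accuracy) from the diachronic f(t), which is the hallmark of what the post calls weak metaphysics.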

[Photo: Oliver Heaviside, who pioneered the Operational Calculus]

I sometimes like informally to refer to this dichotomy between the diachronic and synchronic as expressing “left side” and “right side” rationality respectively. Thus, one can imagine this bilateral architecture as two diametrically opposed but complementary hemispheres of a metaphorical epistemological brain.

Aristotle was the first to remark on this epistemological dichotomy of knowledge. He placed the traditional sciences on one side, characterising them as all studying objects that have a determined genus. On the other side he placed an entirely different kind of science, characterised by studying entities with completely undetermined genus. The latter science became known as metaphysics which, to Aristotle, was the science of Being, pure ontology. Writing about metaphysics, Kant once bemoaned:

It seems almost ridiculous, while every other science is continually advancing, that in this, which pretends to be Wisdom incarnate, for whose oracle every one inquires, we should constantly move round the same spot, without gaining a single step. (Kant, 1781)

The same can be said in modern times: metaphysics is now in disarray, often demeaned, even ridiculed, by many scientists. The objective of this post is to correct the slide of metaphysics into scientific oblivion. My first step is to demystify the subject by citing the non-diachronic approach of Operational Calculus as an example of what I call weak metaphysics. According to my formulation, strong metaphysics must be strongly synchronic. This demands that all pertinent players be simultaneously present in any whole. The Operational Calculus can represent a simple system as a whole; that is its speciality. However, these kinds of systems are made up of objects only. There are no subjects present in the synchrony. Strong metaphysics, as we shall see, demands that not only all objects but also the subject be present.

A characteristic of weak metaphysics is that the relationship between the diachronic and the synchronic is deterministic. For example, the relationship between calculus (diachronic) and the operational calculus (synchronic) can actually be calculated exactly by Laplace transforms. In strong metaphysics, an exact calculation is impossible—such relationships can only be known in terms of dispositions, not coordinates and determined quantities.

Despite the lack of an individual subject, weak metaphysics such as the Operational Calculus does illustrate a number of important characteristics of a strongly metaphysical right side science. Of crucial importance is Aristotle's original characterisation of metaphysics. Unlike the world of calculus, the objects that make up the world of Operational Calculus all have undetermined genus with respect to each other. In the diachronic domain, a simple system is made up of a conglomerate of entities of differing genus, such as inputs, outputs, and system behaviours. In the synchronic domain all such categorical distinctions vanish: all entities are represented in exactly the same way, as functions of a complex variable. Using a term borrowed from Computer Science, one can say that all the entities in the synchronic domain are first class. Aristotle's undetermined genus characterisation becomes a demand that all entities in the system be first class. The Operational Calculus also demonstrates another common characteristic of right side methodology: the first class entities form an algebra. All of the complicated operations in the diachronic domain can be expressed in this algebra, providing great simplification.

Another example of weak metaphysics is Geometric Algebra (GA), which provides an operational alternative to the traditional matrix and tensor dominated approach of linear algebra. In GA all entities are first class: tensors, matrices, and vectors give way to the same kind of entity. Everything in GA becomes a geometric entity. As in Operational Calculus, the geometric entities form a simple algebra where, in the case of GA, the role of the geometric product is paramount. The work of Hongbo Li highlights this key aspect of the operational methodology (Li, 2008). Li applies the conformal aspects of GA methodology to provide remarkably simple automated proofs of geometric theorems. A key construct in his algorithms is to privilege multiplicative operations as much as possible at the expense of the additive. To Li, more additive operations mean more algebraic clutter and lead to what he calls mid-term-swell. On the other hand, more of the multiplicative means the retention of geometric meaning and results in great simplification. Li clearly demonstrates how automated proofs, and geometric computation in general, can be greatly simplified using his approach. With more traditional linear algebra and brute force Clifford algebras, the resulting mid-term-swell can be so enormous that solutions become, at best, purely notional. Another key term emerging from Li's work is the purely multiplicative polynomial: the monomial. The monomial expresses pure geometric semantics based on multiplication, free of additive algebraic clutter. In many cases, Li's methodology expresses geometric concepts that distil down to monomials, leading to spectacularly simple solutions free of the dreaded mid-term-swell phenomenon that afflicts non-operational methodology. As will be seen further on, the monomial construct turns out to be of fundamental importance in this project.

In passing, one should note that the modern formulators of GA, such as David Hestenes and Li, consistently claim GA to be the universal algebra of physics and mathematics (Hestenes, 1988). I concur with this appreciation of GA, with the proviso that a number of important ingredients reported in this post be introduced.

There is one other example of a weak metaphysics methodology that I will be examining in more detail further on. It might seem surprising that I put forward Gödel's work on the Completeness and Incompleteness Theorems as such an example. His work is important for this project as it brings into play the logical dimension of metaphysics and, moreover, the dichotomy between what is true and, more fundamentally, what is the truth. Of great significance is the fact that Gödel's work is not mere metaphysical speculation, as it takes place in the full glare of an ingenious mathematical formalism. More of that later.

Contribution of the Stoics

Operational Calculus and Geometric Algebra provide clear examples of operational methodology. They illustrate an important aspect of metaphysics in the sense of the first classness of the fundamental entities. However, they do not embrace the most fundamental aspect: including not just a science of object but also a science of subject. In order to start getting a grasp of what is meant, I turn back to the philosophical terrain of Hellenistic times. The bilateral perspective that I am trying to explain can be seen in the schism between the Epicurean and Stoic schools of thought of that time.

The diachronic left side take was advocated by the Epicureans. The Epicureans were atomists, and believed in a materialist, deterministic world view that is not incompatible with the view of traditional modern science. The exception to absolute determinism was the famous Epicurean Swerve construct whereby, according to the Epicurean doctrine, every now and then atoms would imperceptibly deviate from a strictly deterministic trajectory. In this way, the unstructured primordial universe somehow micro-swerved to evolve to the state it is today. In the broad sweep of the history of ideas, I see the Epicureans and their atomist forebears as early exponents of the left side, diachronic take on reality.

Of central interest in this post are the much less understood early exponents of right side, non-diachronic reality. Here, I am talking about the implacable foes of the Epicureans: the Stoics. The alternative right side approach, exemplified by the Stoics, concentrates on studying the world in between the a priori and the a posteriori, the world that exists now, relative to the organism in question. For the Stoics, only corporeal bodies with extension exist. Only what exists can act upon and be acted upon. Objective reality is sandwiched between the a priori and the a posteriori. To the Stoics, things in the past or in the future do not exist; there is only what exists now, relative to the organism in question. Heroes of the Now, the Stoics had no fear of anything in the past or the future; such things simply do not exist.

As Hahm remarks, "For half a millennium Stoicism was very likely the most widely accepted worldview in the Western world." (Hahm, 1977) However, it was the worldview of the diametrically opposed Epicureans that best corresponds to the present day analytic, diachronic worldview of our time, not that of the Stoics. Moreover, Stoic physics, according to my characterisation, is not physics as the moderns understand it but metaphysics. As such, their perspective on reality should be operational. This is indeed the case, as Stoic physics ticks all the boxes in providing an operational perspective on reality. First of all, Stoic reality is articulated in terms of first class entities according to the mantra: everything that exists is a material body. For the Stoics, the property of an entity was also an entity in its own right, thus guaranteeing that entities are first class. Thus in Stoic physics, properties are also material bodies. As for the entities forming an algebra, the Stoics at least identified the letters of the algebra by borrowing the four-letter primordial alphabet of Empedocles. This necessarily leads to acceptance of the ancient four-element theory of matter, where each primordial element corresponds to one of Empedocles' four "root" letters.

The Stoics also borrowed from Heraclitus, who saw everything in terms of oppositions. Each of the four elements expressed a primordial tension between opposite poles of an opposition. These elements were called Air, Water, Earth, and Fire. Air represented an expansive tension and Water a contractive tension, corresponding to the images evoked by such naming. Earth would (or should, according to me) have been seen as an unsigned tension between two different extensions, and Fire as an unsigned tension between two different (but indistinguishable) singularities. Physical reality for Heraclitus could thus be interpreted as the interplay of these four primordial tensions. Heraclitus saw these primordial tensions as four instances of one single, even more primordial tension called pneuma. Thus, the four element theory became a five element theory of sorts.

Category Theory and the Five Morphisms

To modern eyes, the ancient four element theory might seem like abstract nonsense. However, there is a branch of mathematics that sometimes actually prides itself on its "Abstract Nonsense," viz. Category Theory. Category Theory, despite being encased in a diachronic axiomatic framework, also reveals operational aspirations. Its first classness is expressed in the mantra: everything is a morphism. Morphisms can be represented by arrows, and so Category Theory sees its reality in terms of dyads, not monads as plain Set Theory does. Category Theory rediscovers Heraclitus's four kinds of tension in terms of four distinct kinds of morphism. Instead of Air, Water, Earth, and Fire, Category Theory comes up with the epimorphism, monomorphism, bimorphism, and isomorphism. In Set Theory these morphisms become functions. For functions, there is no difference between bimorphisms and isomorphisms. Note also the "expansive" nature of an epi, the "contractive" nature of a mono, and that the inverse of a bi or iso is a bi or iso, much as Heraclitus would have expected.
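In the Set Theory setting just described, the morphism kinds reduce to familiar properties of functions: a monomorphism is an injection, an epimorphism a surjection, and a bimorphism/isomorphism a bijection. A minimal Python sketch over finite sets (the helper names are mine):

```python
def is_mono(f, domain):
    # Monomorphism of sets = injective: distinct inputs give distinct images.
    images = [f[x] for x in domain]
    return len(images) == len(set(images))

def is_epi(f, codomain):
    # Epimorphism of sets = surjective: every codomain element is hit.
    return set(f.values()) == set(codomain)

def is_iso(f, domain, codomain):
    # For sets, bimorphism and isomorphism coincide: bijective.
    return is_mono(f, domain) and is_epi(f, codomain)

f = {1: "a", 2: "b", 3: "b"}   # epi onto {"a","b"} but not mono
print(is_epi(f, {"a", "b"}), is_mono(f, f.keys()), is_iso(f, f.keys(), {"a", "b"}))
```

The example function is "expansive" in the epi sense (it covers its whole codomain) without being "contractive" in the mono sense, illustrating that the two kinds are independent.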

The vocation of Category Theory is to study mathematical structures common to all mathematics. Thus one could say that these four kinds of morphisms constitute the stuff that mathematics is "made of." Note also that there is an even more primordial morphism in Category Theory than these four: the natural transformation. Saunders Mac Lane, cofounder of Category Theory, once stated that he invented Category Theory in order to study natural transformations. Natural transformations take up the fifth spot in a "five element theory of mathematics."

Stoic Logic

The Stoics embraced Heraclitus’s theory of the five elements and the primordial tensions they convey and incorporated it as the basis for their physics. The Stoics claimed that their philosophical system included physics together with logic and ethics to make up a harmonious whole. However, as de Lacy back in 1945 commented:

One of the many paradoxes associated with Stoicism is the puzzling circumstance that although the Stoics themselves claimed that their philosophy was a perfectly unified whole – so well unified indeed that its various parts could not be separated from one another, and the change of a single item would disrupt the whole system, yet the opponents of Stoicism, even in ancient times, regarded the Stoic philosophy as a mass of inconsistent and incompatible elements. Since much of our information about Stoicism comes from hostile sources, it is much easier for the modern investigator to find the inconsistencies of Stoicism than its unity. In recent years there have been a number of studies attempting to find the unifying element, but the problem is by no means solved. (de Lacy, 1945)

The situation hasn't advanced much since then. In this post, building on previous work, I provide the unifying element for the Stoic system. For the moment, I will simply point out the structural similarities between Stoic physics and Stoic logic.

Stoic logic in its entirety covered a vast range of subject matter, from rhetoric to dialectics, including many subjects that would not be regarded as logic from a modern perspective. However, for the purposes of this post we need only consider the core logical system. For the Stoics, rational reality was subject to the logical principles of the Logos L. The Stoic interpretation of the Logos L took the form of their system Ls, based on the five indemonstrables, considered in detail later. A simplistic interpretation of Stoic logic Ls is to see it as the first historical example of the propositional calculus. In other words, it expresses the zero order logic of particulars. In later work, I intend to show that Ls can be thought of as a first order logic with powerful spacetime-like geometric semantics Gs.

However, for the moment we must be content with a cursory description of how each of the five indemonstrables maps to the corresponding element of the Stoic-Heraclitus physics system Ps. Thus, the question is: how does the Stoic system unite physics with logic? More precisely, how does Stoic logic Ls, based on the five indemonstrables, relate to the Stoic five element theory of substance Ps? The relationship between Ls and Ps has already been reported from several different perspectives in previous papers. The essence of the relationship is illustrated in Figure 1.

Figure 1: Illustrating the Stoic relationship between Ls and Ps, and the corresponding Heraclitus diagrams.

Stoic physics adopted the four element system of Empedocles, including the gender typing. The gender construct is explained in my previous works and will be further explained in this work. I technically refer to it as ontological gender. Gender is the key to understanding how all of this fits together. There is a learning curve for appreciating the full extent and subtleties of the gender construct, the most subtle of all distinctions. For the moment, think of the masculine as expressing pure form. The purest and most primordial expression of form is the singularity. Expressed linguistically, the masculine is pure "is-a." On the other side of the gender divide is the feminine which, in isolation, can be thought of as pure formless extension. Linguistically, the feminine is pure "has-a." The gender calculus (yes, it does form a calculus) expresses the dialectics of the is-a and has-a relationship. As I said, this is the most subtle of all distinctions. It is also the most fundamental.

To be expanded upon….


Kant, I., 1781. The Critique of Pure Reason. The Project Gutenberg EBook.

Moore, D. J. H., 2012. The First Science and the Generic Code. Parmenidean Press. 450 pages.
Moore, D. J. H., 2013a. Now Machines.
Moore, D. J. H., 2013b. The Whole Thing is a (Now) Number.
Moore, D. J. H., 2013d. Logic Driven Physics: How Nature's Genetic Code Predicts the Standard Model.
Moore, D. J. H., 2013. The Universal Geometric Algebra of Nature: Realising Leibniz's Dream.
Moore, D. J. H., 2013. Generic Model versus Standard Model Interactive Database. [Online Database Application]

Reverse Engineering the Genetic Code

This post is a slightly edited version of a submission I recently made for a Challenge prize competition. I didn't win, but the submission provides a reasonable and short overview of my project.



Reverse Engineering the Genetic Code

Understanding the universal technology platform of Nature

Executive Summary

My proposed platform technology for advancing the life sciences is none other than the genetic code itself. Even though all life forms evolve over time, the universal language that codes them has remained virtually unchanged over billions of years. If one wants a fundamental platform for exploring and explaining life, the answer is already there in this universal language of Nature. The Central Dogma of biochemistry implies that the genetic code is a mere transcription language. My project challenges the dogma with the central claim that the four letters of the genetic code express logico-geometric, spacetime-like semantics. In fact, the four letters {A,T,G,C} express timelike, lightlike, spacelike, and singular-like semantics respectively. A central aim is to reverse engineer the code from first principles. In so doing, the code becomes the operational calculus for explaining the organisational principles of life.

The broad idea is not new and was envisaged by Leibniz over three centuries ago. In a famous passage, he sketched out his dream of developing a geometric algebra without number, based on only a few letters, that would simply and non-abstractly explain the form of the natural things of Nature. One could say that Leibniz anticipated the genetic code. However, his vision went much further than that. He claimed that the resulting algebra would have logico-geometric semantics, and so his vision becomes quite revolutionary. More revolutionary still, he claimed that the same geometric algebra would explain not just the animate but also the inanimate. We now know that the organising genetic material of biological organisms is distinct from the functional material of the organism. In the inanimate case of an "organism" like our universe, there appears to be no such distinction: the organising material and the organised are the same stuff. Thus, if Leibniz's vision is valid for the inanimate, then the elementary particles of Particle Physics should be directly and simply explained in terms of the four-letter algebra of the genetic code—now playing the role of a truly universal generic code.

My project involves making Leibniz's vision tractable by developing his Analysis Situs, a geometry without number, in order to provide the logico-geometric semantics of the genetic code. My ideas have rapidly matured over the past year, resulting in the publication of one book and drafts of four long papers on the subject. The third "Leibniz paper" is the most pivotal. The rough draft of the fourth paper shows how the same genetic code organisation predicts the Standard Model of Particle Physics and even surpasses it. Because of its non-empirical nature, my Leibniz style methodology can predict not only the explicitly measurable particles but also the implicit ones, which may be impossible to observe empirically.

The Big Picture

This project takes a leaf from Nature and provides a bilateral approach to science. There are two takes on Nature, requiring two "hemispheres" of knowledge. I refer to the present day sciences as left side sciences. Left side sciences specialise in explaining the a posteriori in terms of the a priori. The empirical sciences harvest data and develop compatible theories to predict future outcomes. Axiomatic mathematics works deductively from a priori axioms to prove a posteriori theorems.

The alternative right side approach, exemplified by the Stoics, concentrates on studying the world in between the a priori and the a posteriori, the world that exists now, relative to the organism in question. For the Stoics, only corporeal bodies with extension exist. Only what exists can act upon and be acted upon. Thus, the Stoic perspective is that objective reality is sandwiched between the a priori and the a posteriori. The perspective is comparable to Leibniz's, albeit more materialist.

Objective reality of an organism is anchored in the immediacy of its Nowness. I call machines based upon this principle Now Machines. I claim that all animates and inanimates are based on the Now Machine principle. The underlying principle is that the organism must not be subject to any extrinsic a priori principle. Borrowing a term from Computer Science, I call the principle First Classness (FC). The dominating principle of Now Machines is the non-violation of FC. The logic involved is similar to the Liar Paradox construct that Gödel used to prove that (left side) mathematics is incomplete. In right side mathematics, it becomes the organisational, self-justifying principle of Now Machines.

The mathematics of corporeal bodies acting and being acted upon leads to a particular kind of geometry with direct historic roots to Leibniz. As succinctly explained by Hongbo Li:

Co-inventor of calculus, the great mathematician G. Leibniz, once dreamed of having a geometric calculus dealing directly with geometric objects rather than with sequences of numbers. His dream is to have an algebra that is so close to geometry that every expression in it has a clear geometric meaning of being either a geometric object or a geometric relation between geometric objects, that the algebraic manipulations among the expressions, such as addition, subtraction, multiplication and division, correspond to geometric transformations. Such an algebra, if exists, is rightly called geometric algebra, and its elements called geometric numbers. (Li, 2008)

Li, together with David Hestenes and other exponents, claims that Geometric Algebra (GA) is the universal language of mathematics and science and so realises Leibniz's dream. I consider their claim premature, as it ignores two vital aspects of Leibniz's vision. The claim ignores the truly universal genetic code of Nature "based only on a few letters." In addition, although GA is not based on coordinates, it still relies on ordinary numbers under the hood. Such a number scheme imposes absolute extrinsic ordering relationships from outside the system and so violates FC. I propose a solution founded on the ancient construct of ontological gender. The pure feminine gender entity is considered to have an attribute, albeit undetermined. The pure masculine gender entity is that attribute as an entity in its own right. Thus there are two entities: the feminine has an attribute; the masculine is that attribute. The feminine corresponds to pure geometric extension, the masculine to the geometric singularity. These are the two building blocks of Now Machines. With gender, the genetic code letters {A,T,G,C} can be expressed by the four binary genders {MF,FF,FM,MM}. Viewed from outside the system, genders are indistinguishable and so appear to be in superposition, opening the way to quantum mechanical interpretations. Like Doctor Who's Tardis on TV, a Now Machine appears bigger on the highly tuned and coded inside than the amorphous mass of superposition seen from the outside. The algebra of gender can replace the algebra of ordered numbers to provide a true "geometry without number." The gendered version of GA articulates the dynamic geometric semantics of the genetic code and provides the final realisation of Leibniz's dream.
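The letter-to-gender correspondence stated above, together with the spacetime-like semantics claimed for {A,T,G,C} in the Executive Summary, can be recorded as a simple lookup table. This is a restatement of the text's own claims, not an independent result:

```python
# The text's claimed mapping: genetic code letters to binary genders.
letter_to_gender = {"A": "MF", "T": "FF", "G": "FM", "C": "MM"}

# The spacetime-like semantics the text assigns to each letter.
letter_to_semantics = {
    "A": "timelike",
    "T": "lightlike",
    "G": "spacelike",
    "C": "singular-like",
}

for letter in "ATGC":
    print(letter, letter_to_gender[letter], letter_to_semantics[letter])
```

Laid out this way, the four letters exhaust the four possible binary genders, one dyad per letter, which is the combinatorial core of the claim.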


New Science: Nature abounds with bilateral structures and asymmetries that remain unexplained by present day science. For example, why are biologically produced amino acids almost exclusively the left-handed L form? In the inanimate realm, why are there no right-handed neutrinos? In order to address these kinds of questions, a new kind of science is necessary. Not only must science explain bilateralism in Nature, but the science must itself take on a bilateral epistemological architecture. Like the biological brain, science must develop two distinct but complementary takes on reality. In modern times, there has only been one “left side” science. This project unearths the complementary “right side.”

Overcoming Barriers: Nature herself has technological differences but no ontological barriers. The new right side science I propose unifies the science of the inanimate with the animate. “Life is everywhere,” so to speak.

Public Impact: Left side science got off the ground with Leibniz and Newton’s discovery of calculus, the ultimate public impact of which is incalculable. Right side science starts with the discovery of how the genetic code harbours the geometric calculus and semantics of life systems ranging from the animate to the inanimate. The public impact would surely be comparable.

Science Deficits: Psychologists have discovered that a patient with only a fully functional left brain may exhibit bizarre behaviour, such as only eating food on the right side of the plate. They call it hemineglect. I claim that left side mathematics suffers from the same “cognitive deficit.” The phenomenon can be traced to left side geometry, which only needs timelike and spacelike lines to work. In other words, the geometry only uses the two-letter alphabet {A,G}. It only fires on two cylinders! The right side geometry is based on the genetic code letters {A,T,G,C} and so, like its biological counterpart in the right hemisphere, is cognizant of both sides of a bilateral world. Thus, in some cases, better instrument technology in left side science will be pointless because of the hemineglect blind spot of left side mathematics, and the mathematician will never know.

Both right side science and its right brain counterpart suffer a different kind of deficit. They are mute. However, although communication to outside the system is impossible, the right side can communicate with itself. That is what the universal language of Nature is for.


Present orthodoxy sees living organisms as results of evolution. Thus, man is the product of millions of years of genetic accidents. He is a genetic freak. The alternative right side science view is that the very essence of life is present from the very beginning. As foreseen by Leibniz, there is a universal algebra articulating the same life essence shared by all beings, ranging from the neutrino, the quark, the amoeba, through to man. In this context, man emerges from a universal principle, a much more noble scenario than being a genetic freak.

Some novel points:

  • Science should be bilateral like the two brain hemispheres.
  • Everything from the ground up can be explained in terms of gender.
  • The letters {A,T,G,C} of the genetic code correspond to the binary genders {MF,FF,FM,MM}.
  • The organisational principle of life is based on a form of the Liar’s Paradox.
  • Leibniz was right on the money. The Stoics also had the right mindset.

Risk and Challenges

If this kind of science were to be fundamentally intractable, as many claim, then the project would be doomed to failure. After many decades of effort, my four draft papers demonstrate tractability and hence remove that risk.

The challenge of developing the new mathematics required is quite daunting and I need help. One sub-project, possibly even Nobel Prize material, is to explain the so-called degeneracy of the genetic code, at least in the biological realm. My approach is that each codon codes an elementary geometric form. According to my theory, the start codon ATG expresses the Lorentz semantics of Special Relativity, where the codon is made up of a single timelike A, lightlike T, and spacelike G form. Such a composite geometric form can be considered homogeneous and so satisfies FC. Hence, no need for degeneracy. The only other non-degenerate codon is TGG. TGG codes the semantics of a de Sitter space, which has known General Relativity interpretations and is homogeneous. I claim that, for homogeneity compliance, all other elementary forms must be appended with extra dimensions. Hence the degeneracy of all other codons.



Li, H., 2008. Invariant Algebras and Geometric Reasoning. Singapore: World Scientific Publishing.
Moore, D. J. H., 2012. The First Science and the Generic Code. Parmenidean Press. 450 pages.
Moore, D. J. H., 2013a. Now Machines.
Moore, D. J. H., 2013b. The Whole Thing is a (Now) Number.
Moore, D. J. H., 2013d. Logic Driven Physics: How Nature’s genetic code predicts the Standard Model.
Moore, D. J. H., 2013. The Universal Geometric Algebra of Nature: Realising Leibniz’s Dream.
Moore, D. J. H., 2013. Generic Model versus Standard Model Interactive Database. [Online Database Application]


What is Gender?


There is no construct in science more fundamental than gender. The ancients knew this but the moderns have long since forgotten it.

This post will explore the epistemological and ontological potential of gender in providing a unifying foundation for science and mathematics. In this respect, the structure of the French language provides a first glimpse of the relationship between knowledge and gender. French tends to explain concepts in terms of oppositions, often expressed across opposing genders. For example, French for knowledge is the feminine term la connaissance. The natural corresponding opposition in French is the masculine le savoir. Someone with a lot of specialised connaissance, or knowledge, is a connaisseur. The most extreme connaisseur is the legendary idiot savant, the one who can digest the contents of the Yellow Pages in one sitting. On the opposite side of the fence is the savant of the non-idiot kind. The most gifted savant of all time was the equally legendary Socrates, who had no reliable knowledge whatsoever, as expressed in his Confession of Ignorance. However, he knew that fact with absolute certainty, the mark of the true savant. It is quite ironic that the Socratic Confession of Ignorance provides the key principle in developing an algebra capable of integrating pure ignorance with pure certitude in a tractable manner, as we shall see.

Including axiomatic mathematics, all of the traditional modern day sciences are of the ordinary, common sense, analytic, fact-based, “connaissance” style of scholarship. These sciences are all well known as deductionist, abstract, atomist, and dualist. Employing the metaphor of the biological brain, we will refer to these sciences as instances of the left side scientific paradigm. The position we take in this paper is that the left side paradigm is totally unsuited to provide a foundational science. Any unifying foundational science must be based on savoir, not connaissance. The savoir kind of scholarship we refer to as right side science. Our first task will be to explain the central role of gender in right side science.

Different natural languages implement gender in various grammatical ways. For example, Tagalog of the Philippines is remarkable for its complete absence of grammatical gender. Even personal pronouns are neuter and so do not explicitly expose the sexual gender of the respondent. At the other end of the spectrum is Jingulu, an Aboriginal language of Australia that has four genders. It is also interesting to note that Jingulu, like other Aboriginal languages, does not categorically distinguish nouns from adjectives; they all collapse into a broader category of nominals. In this paper we introduce the study of a code-like language where even the categorical distinction between nominal and verb, and indeed any other grammatical category, evaporates. The syntax becomes so generic that it virtually disappears. We call this language the generic code. We propose this language as the calculus for right side science. All natural languages are left side languages. There is only one right side language, the generic code. We will show how the semantics of this generic code can be reverse engineered from generic principles. With great trepidation, we also claim that this reverse engineered language provides the semantic foundations of the biological genetic code. In other words, the genetic code is an instance of a totally universal, generic code. This generic code is not subject to evolution. It must be in place right from the very beginnings of whatever might start to begin. We will show that the most salient feature of this generic cum genetic code is that, like Jingulu, the language is based on four genders.
Before attempting to tackle the problem of developing a generic language, we must look at the generic problem domain in which it is to operate. Generic language is to provide the calculus for a generic science. What is the nature of such a science?

Syllogistic Logic

Traditional sciences and mathematics are very “left brained”: abstract, dualist, empirical, atomist, and reliant on a rhetorical form of reasoning. In antiquity, the Epicureans privileged that form of thought. The Stoics favoured a non-dualist, non-atomist, dialectical form of reasoning. When it comes to Aristotle, such a dichotomy is not at all clear cut. As well as being the greatest philosopher of all time, Aristotle was also the greatest fence sitter of all time. With him, our neat dichotomy between left side and right side thinking meets a blank. This man had a foot firmly placed on both sides. Nowhere is this more apparent than with his categorical logic and in particular his square of oppositions. In this section, without going into too much detail, we summarise the aspects that immediately concern our project.

Figure 1 The four kinds of terms. The Scholastics later labelled them with four letters.

The Four Terms and the Left Side

Aristotle’s syllogistic term logic was half modern and half ancient. We will suspend judgment on which was the better half. The modern half is exhibited in two ways: it relies on abstraction and it deals with propositions expressible in natural language. The abstraction can be seen in the use of the universal quantifier “All.” “All men,” for example, means every man. By referring to “all men,” or every man, one is referring to an abstraction, a generalisation. As the Stoics pointed out, abstractions and generalisations do not exist as real entities. In addition to abstraction, there is the fact that the logical representation of these syllogisms can be covered by Venn diagrams as shown below. The terms can be said to have “Venn diagram” semantics.
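The “Venn diagram” semantics can be made concrete with the standard set-theoretic reading of the four term forms. A minimal sketch, with illustrative set names:

```python
# The four Aristotelian term forms read as predicates over finite sets,
# i.e. the standard set-theoretic ("Venn diagram") semantics.
def form_A(S, P):  # "All S are P": S is contained in P
    return S <= P

def form_E(S, P):  # "No S is P": S and P do not overlap
    return not (S & P)

def form_I(S, P):  # "Some S is P": S and P overlap
    return bool(S & P)

def form_O(S, P):  # "Some S is not P": part of S lies outside P
    return bool(S - P)

men = {"Socrates", "Plato"}
mortals = {"Socrates", "Plato", "a horse"}
assert form_A(men, mortals)                          # All men are mortal
assert form_A(men, mortals) != form_O(men, mortals)  # A and O contradict
```

Note that on this reading A and O, like E and I, are contradictories by construction: one is the logical negation of the other.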

Both of these aspects, the abstract and static nature of the logic, are characteristics of left side thinking. By default, left side thinking has become synonymous with the modern.

The Four Terms and the Right Side

However, what is not modern in Aristotle’s logic is that his infrastructure of the four kinds of terms is determined not by a set of axioms, but rather by a pair of oppositions and the opposition between these oppositions. This is exactly the approach we have been using to construct our semiotic squares in other sections of the blog. Firstly, obtain a pair of oppositions. Employ one opposition to define a left-right dichotomy and the other opposition for the front-back structure.

Figure 2 Venn diagrams for the four terms of Aristotle

In Aristotle’s case, the left-right dichotomy is a strict logical opposition between the affirmative form and the negative. The second opposition is between the universal and the particular. Both these oppositions must be true dichotomies in order to construct a non-trivial semiotic square. This is a technical point, but a very important one and will be discussed later when considering Aristotle’s square of oppositions. It turns out that there can be certain cases where an opposition is not a true dichotomy. This can occur when the subject of a term has no existential import. In other words, when dealing with empty sets such as “All centaurs.”

Figure 3 The semiotic square for the four terms of Aristotle’s Syllogistic logic. The square is formed from two oppositions, the negative/affirmative, and the universal/particular.

Term Logic

During the Middle Ages, the Scholastics labelled the four kinds of terms with the four letters A, I, O, and E. Syllogisms consist of three propositions: a major premise, a minor premise, and a conclusion. Each syllogism could thus be labelled by a triplet of letters taken from the four-letter AIOE alphabet. This fascinated the Scholastics and, many years ago, entertained the author’s curiosity for some time. The reason for the author’s interest was that such a system bore some resemblance to the triadic structure of codons in the genetic code. With a bit of effort, one can make some kind of rapprochement between the AIOE alphabet of the Scholastics and the genetic-cum-generic AUGC alphabet, but the effort is probably not justified, as there are richer pickings elsewhere, notably in Stoic logic.

The genetic codon structure only has 64 combinations. What we have ignored so far for Aristotle’s syllogisms is the detail of how the three propositions in each syllogism hook together. We have ignored the fact that there are four different figures of the syllogism. Thus, taking into account the four figures, instead of 64 possible syllogisms there are 256. Only nineteen of these syllogisms are regarded as leading to a valid conclusion.
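The counting here is easy to verify mechanically. The sketch below enumerates the 4³ × 4 = 256 mood-figure combinations and recovers the nineteen traditionally valid forms from the vowels of the medieval mnemonic names (Barbara, Celarent, and so on), grouped by figure as in the standard Scholastic lists.

```python
from itertools import product

# The four Scholastic term letters and the four syllogistic figures.
LETTERS = "AIOE"
FIGURES = (1, 2, 3, 4)

# Every possible mood-figure combination: 4^3 moods x 4 figures = 256.
all_forms = [(mood, fig)
             for mood in ("".join(t) for t in product(LETTERS, repeat=3))
             for fig in FIGURES]
assert len(all_forms) == 256

# The nineteen traditionally valid forms, encoded by the vowels of the
# medieval mnemonic names, grouped by figure.
MNEMONICS = {
    1: ["Barbara", "Celarent", "Darii", "Ferio"],
    2: ["Cesare", "Camestres", "Festino", "Baroco"],
    3: ["Darapti", "Disamis", "Datisi", "Felapton", "Bocardo", "Ferison"],
    4: ["Bramantip", "Camenes", "Dimaris", "Fesapo", "Fresison"],
}

def mood_of(name):
    """Extract the A/I/O/E vowels of a mnemonic, e.g. Barbara -> AAA."""
    return "".join(c for c in name.upper() if c in LETTERS)

valid = [(mood_of(n), fig) for fig, names in MNEMONICS.items() for n in names]
assert len(valid) == 19          # e.g. ("AAA", 1) is Barbara
```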

Aristotle’s syllogistic logic provides a logical tool that is applicable to the contingent world. Unlike modern logic, it also brings with it some nontrivial semiotic infrastructure, the square of oppositions.

The Square of Oppositions

Aristotle described how the four kinds of terms could be placed in a square illustrating the various oppositions between them. He then went about characterising each kind of opposition, although the subalterns were not mentioned explicitly. The oppositions between universal statements are contraries. Contraries have the property that both cannot be true together. One may be true and the other false. It is also possible that both can be false together. On the other hand, subcontraries involve oppositions between particulars. In this case, both cannot be false together.

Figure 4 (a) The modern logic version of the oppositions. (b) Aristotle’s square of oppositions.

The Modern Square of Oppositions

Of great interest to us is an opposition at a higher level altogether, the opposition between Aristotle’s syllogistic structures and modern logic. The dramatic difference between the two approaches was clearly illustrated by George Boole, in what has become the modern version of the Square of Oppositions.

Modern logic differs from the ancient logic by simply replacing the universal with the general, in other words with the abstract. This can be achieved by using labels, and the logic becomes symbolic logic. Thus, the term ‘All men’ is replaced by the abstract version ‘All X’. Replacing the thing with a label introduces different semantics. One could say that the semantics go out the window and are left trivialised. The label becomes simply a placeholder and as such, like any placeholder, may be empty. The logicians explain this as relaxing the requirement of existential import. From a classical mathematics perspective, the generalisation introduced by modern logic is to allow sets to be empty. This allows modern logic to talk about things that are known not to exist, a characterising feature of abstraction.

Once the reasoning becomes abstract, the logical difference between yellow centaurs and canaries evaporates. Not only that, but all the oppositions except the contradictories have also evaporated. For example, both sides of the contraries opposition ‘All centaurs are yellow’ and ‘No centaur is yellow’ are true. The contraries opposition has evaporated.
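The collapse of the contraries is easy to demonstrate with empty sets, which is exactly what relaxing existential import amounts to. A minimal sketch, with illustrative set names:

```python
# With no existential import, the subject set may be empty.
centaurs = set()                      # known not to exist
yellow_things = {"canary", "the sun"}

# A-form: "All centaurs are yellow" reads as subset inclusion,
# which holds vacuously for the empty set.
all_centaurs_yellow = centaurs <= yellow_things
# E-form: "No centaur is yellow" reads as empty intersection,
# which also holds.
no_centaur_yellow = not (centaurs & yellow_things)

# Both contraries are true at once: the opposition has evaporated.
assert all_centaurs_yellow and no_centaur_yellow
```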

Figure 4 (a) shows the resulting modern logic version of the square of oppositions. The square has virtually collapsed and only the contradictories and the subcontraries survive. We have deliberately drawn the modern version on the left side relative to Aristotle’s square to illustrate that this is the left side variant of logic. The other variant, Aristotle’s, is the seed for the right side version. The left side involves abstract, symbolic logic. The right side in the diagram represents Aristotle’s version of elementary generic logical structure. In practice, the modern symbolic logic approach boils down to a simple bipolar nominalism where the basic opposition is between two particulars, I and O. The letters A and E act as pure label signifiers for I and O respectively, which act as the signified. The contradictory oppositions A-O and E-I model the relationships between signifier and signified. In essence, the system becomes a simple two-letter system labelled by A and E. Thus, although we have not shown that modern day logicians only use half a brain, we are starting to see that they reason using only half of Nature’s alphabet.




Where is the centre of the universe?

This is a post from the Stoic mailing list at Yahoo Groups. It touches on a central tenet of Stoicism.

Jan wrote:

It’s certainly traditional Stoic doctrine that somehow connected with the all-pervading Logos (=Zeus=Nature=Providence=designing fire) is the obligatory law of nature, aka jus nature; the mind of the (human) sage is, according to classical Stoicism, aligned with this law of nature. That’s a bit too mystical for me (although it was convenient enough for the ancient Stoics.)


The Shape of Mind

Figure: the brain’s two hemispheres.

This section is about multiplication. In the broad sense, multiplication brings two things together to make a third. In the case of numbers, this leads to simple arithmetic. In the case of two inebriated men at a bar, it can lead to a bar room brawl. The ancients, both in the West and the East, were interested in bringing two principles together, one masculine and one feminine. The multiplication of these two principles created the Cosmos. We will visit the ontological and epistemological roles of gender later. For the moment, we are interested in multiplying together two different ways of thinking, two different takes on reality. In mathematics, there are so many different kinds of multiplication that it can be overwhelming. We are particularly interested in the role of multiplication in geometry. There is one kind of geometry that is pertinent.