Logic Driven Science: Physics without Attributes

[Image: Lingam in the sea]

Physics, as it is presently construed, involves the study of physical phenomena. I will call this kind of science phenomenal physics. Of central concern is the motion of physical bodies. Classical Newtonian physics proposed the first version of the laws of motion of such bodies. Einstein provided a second version that took relativity into account. At the macro level, the laws of motion based on Special and General Relativity are so accurate that for all intents and purposes they are generally considered exact. However, at the quantum level of physical reality the deterministic laws of macro physics break down. The breakdown is dramatic. David Bohm remarks that at this level:

…there are no laws at all that apply in detail to the actual movements of individual particles and that only statistical predictions can be made about large aggregates of such particles. (Bohm, 1980)

The laws of motion for individual particles simply vanish at the quantum level. Quantum Mechanics takes up the challenge and provides the wave function as the necessary probabilistic way of predicting the phenomenon of individual particle behaviour.

At the level of elementary particles, phenomenal physics has virtually nothing to say about the state of affairs of any individual particle, except at the extreme instant of measurement. The only two possible exceptions are at opposite ends of the phenomenal spectrum and are constants. These are rest mass and the speed of light, which both appear to be stable measurables and are useful for scaling the system. Any non-constant property of an individual particle is effectively meaningless as a quantity. I will henceforth refer to these non-constant properties as attributes.

In this paper, I accept the scientific uselessness of the attributes of an individual particle. I then proceed to argue that it is useless to carry such burdensome luggage along in the formalism needed to understand elementary particles. Attributes only add unnecessary clutter to the science. After taking this dramatic step, we are naturally led to another kind of physics—physics without attributes. Physics without attributes is obviously a different breed of fish from traditional phenomenal physics. For want of a better name, I will call this science generic physics.

Generic Physics

Bohm argued that there was another side to physics – the Implicate Order.

The relationship between phenomenal physics and generic physics is somewhat like that imagined by Bohm in his Explicate Order and Implicate Order idea. The Explicate Order corresponds to traditional phenomenal physics, which he saw as derivative of a higher, ultra-holistic, unifying Implicate Order. Bohm's approach has many similarities with the one I have been developing in previous work. Like myself, he even refers to the left and right brain analogy. In order to lighten the terminology, I will sometimes refer to the phenomenal, "Explicate Order" side as the "left side" paradigm or point of view, and the generic, "Implicate Order" side as the "right side" paradigm. In this paper I will provide the constructs necessary to formalise the difference between the two paradigms and their formal nature, something that is missing from Bohm's account. As will be seen, my account of the right side paradigm is presented quite differently from Bohm's Implicate Order.

For me, order is the affair of the left side paradigm, a paradigm shared by all the traditional sciences, including axiomatic mathematics. From an epistemological perspective, the left side "Explicate Order" sees reality as diachronic. The diachrony in mathematics is expressed at the elemental level as number. The diachrony of number is most forcibly expressed in Peano arithmetic in the form of five axioms essentially defining the successor function, the fundamental mathematical engine of diachrony. This was recognised by Russell and Whitehead in building their Principia Mathematica system, and equally by Gödel, who brought it tumbling down. Intuitively, the diachronic nature of the left side paradigm can be thought of as a world view relating the a priori with the a posteriori. The diachronic structure applies no matter what the science, or whether it is mathematics or logic.

Turning back to the much less familiar right side paradigm, Bohm sees this as a higher order form of organisation, his Implicate Order. He still sees this holistic, unifying paradigm as an order, whatever that may mean. Moreover, he also still sees it as phenomenal physics, albeit operating at a higher organisational level. The fragmented, localised perspective of the left side paradigm gives way to a flickering, hologram-like[1] image of reality. Standing waves of interfering quantum fields determine what we see as particles, explains Bohm. The imagery has some merit but lacks any rigorous formalising methodology.

My approach to the "Implicate Order" is not to see order at all, but its complete abolition. The diachrony gives way to a pure synchrony. The perspective is that of the ancient Stoics, who claimed that the only things that exist are those that exist synchronously with the subject. Objects in the past do not exist; neither do objects in the future. The only things that exist are the objects in the immediacy of now, relative to the subject. To the materialist Stoics, the objects in existence must be material bodies, capable of acting or being acted upon. From a Stoic perspective, Bohm's Implicate Order takes place in the immediacy of a subject's nowness.

How to get rid of attributes

Generic physics is physics without attributes. Getting rid of attributes is one thing, but what can we replace attributes with? The answer to this little puzzle is surprisingly simple as well as surprisingly profound. We start by considering an entity which has a single attribute and examining the entity-attribute relationship.

First, take the diachronic traditional viewpoint of all the traditional sciences and axiomatic mathematics. According to the conventional wisdom of the left side doctrine, there is a distinct dichotomy between entities and attributes. No entity is ever an attribute nor any attribute ever an entity.  Then comes the problem of gleaning knowledge about the entity.  Conventional wisdom clearly would say that one cannot get to know the entity directly but only via its attribute. Thus any science pertaining to such entities must be attribute driven.  In other words, common sense declares that science, and hence physics, must be empirical in nature. This is the standard orthodoxy proclaimed by all left side science. There are no surprises there.

Now turn to the not so orthodox right side perspective.  This is the perspective that does away with the need for attributes.  In the left side scenario, the scene was occupied by an attribute with the corresponding entity hidden off-stage. Knowledge of the entity is gained by getting to know the antics of the on-stage attribute. In the right side scenario both the entity X and the attribute Y are on centre stage. The attribute is considered as an entity in its own right. Any specificity it may or may not convey is of no importance. What matters is the dialectical relationship between these two players.  This relationship is semantic.  The entity X will express its only known specificity, the fact that X has an attribute.  The entity Y will express its only known specificity, the fact that it is an attribute. To use expressions familiar in Computer Science, X expresses HAS-A semantics, whilst Y expresses IS-A semantics.  The basic idea in this right side science, is that one doesn’t care any more about the value of attributes. What matters is whether an entity is an attribute or has an attribute.

This IS-A, HAS-A construct leads to a generic way of typing entities. I call this construct ontological gender. An entity with HAS-A typing will be said to be of feminine gender and an entity with IS-A typing will be said to be of masculine gender. Of fundamental importance is to realise that gender is not an attribute. Two entities with different attributes can be distinguished from each other by attribute comparison. Two entities of different gender cannot be distinguished from each other by attribute comparison, for the simple reason that there is only one attribute between them. One has it, the other is it. In what follows, I will show how this gender construct maps onto the ancient use of the construct in Stoic physics and Stoic logic.


One use of the IS-A and HAS-A construct in computer science is in the design of Object Oriented programming languages. The early OO language C++ allowed open slather multiple inheritance of entities with IS-A and HAS-A semantics. This was found to lead to bad programming practice. The next generation of OO languages, such as Java and C#, were designed to allow only the single inheritance of IS-A semantics. Inheritance should be limited to the masculine line. For example, a Cadillac IS-A Car. Also, a Cadillac HAS-A CD player, HAS-A engine, etc. Whilst it is perfectly reasonable that the class of Cadillacs inherit the common interface of the class of Cars, it doesn't make much sense for a Cadillac to inherit the interface of CD players or engines.
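The discipline can be sketched in code. In this minimal sketch (the class names and the "V8" detail are illustrative inventions, not from any real API), inheritance follows the IS-A line only, while the HAS-A relationships are carried by composition:

```python
# Illustrative sketch of the IS-A / HAS-A distinction in OO design.
# All names here are hypothetical examples.

class Car:
    def drive(self):
        return "driving"

class CDPlayer:
    def play(self):
        return "playing"

class Cadillac(Car):                      # IS-A: inherits the Car interface
    def __init__(self):
        self.cd_player = CDPlayer()       # HAS-A: composition, no interface inherited
        self.engine = "V8"                # HAS-A

c = Cadillac()
print(c.drive())            # reached via the IS-A (masculine) line
print(c.cd_player.play())   # reached via the HAS-A (feminine) relationship
```

The Cadillac can drive because it IS-A Car, but it does not play anything itself; it merely HAS-A player that does.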

It’s a bit like traditional societies where the family unit inherits the surname and clan membership (inherit the social interface) from the male IS-A line whilst the feminine turns up with the dowry of ten cows (HAS-A). I find that fascinating but will not dwell on it. This is good programming practice in OO!

In quantum mechanics, the famous Bell test experiments demonstrated that at the micro level there are no local hidden variables, no intrinsic attributes. Attributes are only accidental and have no place in universal science. What matters is qualification in terms of the universal IS-A and HAS-A qualifications. Quantum mechanics based on IS-A and HAS-A quantum states is the way to go. I will be developing this theme in later posts and in a paper I am writing.

A computer illustration of gender would be the placeholder-value dichotomy. Consider a standard 32 bit computer. The computer would have 4 gigabytes of addressable memory, and each 32 bit location can store a value ranging from zero to about 4 "gig". From an attribute perspective, this computer is a cruncher of 32 bit numbers and it is hard to understand how it works. However, ignoring the specificity of the numbers, one can look at a computer as being organised along gender lines. A placeholder for a value can be thought of as feminine, and the value contained as masculine. Consider now the contents of a general purpose register in the computer. What is the gender of the number contained in the register? From the register point of view, the number is a contained value, and hence masculine. However, this number could also be interpreted as a pointer to a memory placeholder, and hence be interpreted as feminine. Is it a pointer or a value? Is it feminine or masculine? In actual fact, without knowing the complete context, there is no way of telling the difference. The gender status of the general purpose register could be said to "be in superposition." Nevertheless, despite the fleeting nature of gender when viewed by a third party, we do now know that a computer is a system involving the dynamic organisation of value and placeholder semantics. However, the gender structure in computers is extremely shallow, so this is a somewhat desperate example intended to get the reader to begin seriously grappling with the gender concept. It is more an allegory than an example.
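The ambiguity can be simulated in a few lines, with a list standing in for addressable memory (the memory size, the address 3, and the value 42 are all arbitrary choices made for illustration):

```python
# Toy simulation of the placeholder/value ambiguity.
# "memory" stands in for addressable storage; all numbers are arbitrary.

memory = [0] * 8
memory[3] = 42

register = 3                    # just a number sitting in a register

as_value = register             # masculine reading: the content taken as a value
as_pointer = memory[register]   # feminine reading: the same content taken as an address

print(as_value, as_pointer)     # the same register content, read two ways
```

Nothing in the register itself distinguishes the two readings; only the surrounding context does.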

In summary, the gender construct provides an alternative to attribute based semantics. Gender semantics provide a qualitative alternative to the traditional quantitative approach. Of course, entities typed as having a single masculine or feminine gender are too ephemeral to be considered discernible entities. However, the situation changes in the case of entities with mixed gender. Rather than considering gendered monads as the building blocks of the science, consider dyads where each end of the dyad is simply gender typed as masculine or feminine. This leads to four possible binary gendered dyads: MF, FM, FF, and MM.

Because gender is an attribute free construct, it is not restricted to the attribute specificity of any particular problem domain. It is a truly universal construct and can literally apply to any problem domain whatsoever. Of particular interest in this paper is to associate gender with logic. My overall strategy is to exploit this universal gender logic as the logical foundation for physics. The proof of the pudding will be to show how this foundational logic naturally leads to a generative scheme that enumerates the elementary particles of a logical physical reality. The approach is generic and independent of any specific attribute system. The predicted elementary particles would apply to any phenomenal reality as long as it is "logical."

What are the logical properties of gender? In this quest one is immediately led to Aristotle's Term Logic, the Syllogistic. The formal structure of the syllogism is quite simple. Each syllogism is made up of three propositions: a Major premise, a Minor premise, and a Conclusion. Each proposition takes one of four elemental forms. It is not difficult to discern the implicit gender typing in this syllogistic system. Each form is binary typed. Aristotle doesn't use a masculine-feminine dichotomy but a Distributed-Undistributed dichotomy: a subject or predicate is either Distributed or Undistributed. Thus the four possible forms are typed as DU, UD, UU, and DD. The textbooks make valiant attempts to explain whether a subject or predicate is distributed or undistributed. The best way is to simply see the distributed subject or predicate as expressing IS-A semantics and the undistributed as expressing HAS-A semantics. In other words, the distributed corresponds to masculine typing and the undistributed to feminine.
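The typing can be tabulated directly. The A/E/I/O labels below are the standard textbook mnemonics for the four categorical forms; the gender column is the identification proposed in this post, reading D as masculine (IS-A) and U as feminine (HAS-A):

```python
# Distribution of subject (S) and predicate (P) in the four categorical forms,
# with the gender reading proposed in this post.

forms = {
    "A": ("All S is P",      "D", "U"),   # DU
    "E": ("No S is P",       "D", "D"),   # DD
    "I": ("Some S is P",     "U", "U"),   # UU
    "O": ("Some S is not P", "U", "D"),   # UD
}

gender = {"D": "masculine (IS-A)", "U": "feminine (HAS-A)"}

for label, (reading, s, p) in forms.items():
    print(f"{label}: {reading:18} subject {gender[s]}, predicate {gender[p]}")
```

Note that the four distribution patterns DU, DD, UU, UD line up one-for-one with the four gendered dyads MF, MM, FF, FM of the previous section.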

For a rapid refresh of syllogistic logic in this context, I recommend that the reader spend a few minutes with my online syllogistic machine.

However, the logical platform that we need to generate the elementary constituents will not be Aristotle’s Syllogistic logic but rather the closely related Stoic logical system that came later.  

[1] In previous work I explained how a weak version of the left and right side paradigms can be found in Heaviside's Operational Calculus. On the left "time domain" side can be found time series and a complicated calculus of differential equations. On the right "frequency domain" side can be found a simple algebra of functions of a complex variable, calculable by Laplace transforms. Note that the Laplace transform F(s) of a continuous function f(t) has the "holographic" mathematical property that given F(s) on a region, no matter how small, the rest of F(s) can be perfectly reconstructed (by analytic continuation).

Bilateral Science

This post is working towards a paper I will call Logic Driven Physics. At the moment, I believe that I am the only person in the world writing this story of how the science of the Stoics can be reverse engineered to provide a new, alternative take on physics, logic, and mathematics.

In this post, I consider physical reality as a system. I take a leaf out of system science, where there is not one paradigm for understanding a system but two. I argue that the foundations of science, including physics and mathematics, must be bilateral. System science demands two takes on reality. One take is diachronic in nature, the other synchronic. In system science, the diachronic side employs ordinary calculus and studies time series, whilst the synchronic side employs the operational calculus pioneered by Heaviside and sees its reality as having a "holographic" flavour, expressed nowadays in Laplace and Fourier transforms.

[Image: Heaviside, who pioneered the Operational Calculus]

I sometimes like informally to refer to this dichotomy between the diachronic and synchronic as expressing “left side” and “right side” rationality respectively. Thus, one can imagine this bilateral architecture as two diametrically opposed but complementary hemispheres of a metaphorical epistemological brain.

Aristotle was the first to remark on the epistemological dichotomy of knowledge. He placed the traditional sciences on one side, characterising them as all studying objects that have a determined genus. On the other side he placed an entirely different kind of science, characterised by studying entities with completely undetermined genus. The latter science became known as metaphysics which, to Aristotle, was the science of Being, pure ontology. Writing about metaphysics, Kant once bemoaned:

It seems almost ridiculous, while every other science is continually advancing, that in this, which pretends to be Wisdom incarnate, for whose oracle every one inquires, we should constantly move round the same spot, without gaining a single step. (Kant, 1781)

The same can be said in modern times, with metaphysics now in disarray, often demeaned and even ridiculed by many scientists. The objective of this post is to correct the slide of metaphysics into scientific oblivion. My first step is to demystify the subject by citing the non-diachronic approach of Operational Calculus as an example of what I call weak metaphysics. According to my formulation, strong metaphysics must be strongly synchronous. This demands that all pertinent players must be simultaneously present in any whole. The Operational Calculus can represent a simple system as a whole. That is its speciality. However, these kinds of systems are made up of objects only. There are no subjects present in the synchrony. Strong metaphysics, as we shall see, demands that not only must all objects be present but also the subject.

A characteristic of weak metaphysics is that the relationship between the diachronic and the synchronic is deterministic. For example, the relationship between calculus (diachronic) and the operational calculus (synchronic) can actually be calculated exactly by Laplace transforms. In strong metaphysics, an exact calculation is impossible—such relationships can only be known in terms of dispositions, not coordinates and determined quantities.
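The exactness of this left-right relationship can be stated concretely via the standard transform pair (written here in the textbook convention for a function f(t) with transform F(s)):

```latex
F(s) = \mathcal{L}\{f(t)\} = \int_0^\infty e^{-st} f(t)\,dt,
\qquad
\mathcal{L}\{f'(t)\} = s\,F(s) - f(0).
```

Differentiation on the diachronic side becomes multiplication by s on the synchronic side, so a differential equation becomes an algebraic equation that can be solved exactly and then inverted.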

Despite the lack of an individual subject, a weak metaphysics such as the Operational Calculus does illustrate a number of important characteristics of a strongly metaphysical right side science. Of crucial importance is Aristotle's original characterisation of metaphysics. Unlike the world of calculus, the objects that make up the world of the Operational Calculus all have undetermined genus with respect to each other. In the diachronic domain, a simple system is made up of a conglomerate of entities of differing genus, such as inputs, outputs, and system behaviours. In the synchronic domain all such categorical distinctions vanish: all entities are represented in exactly the same way, as functions of a complex variable. Using a term borrowed from Computer Science, one can say that all the entities in the synchronic domain are first class. Aristotle's undetermined genus characterisation becomes a demand that all entities in the system must be first class. The Operational Calculus also demonstrates another common characteristic of right side methodology: the first class entities form an algebra. All of the complicated operations in the diachronic domain can be expressed in this algebra, providing great simplification.

Another weak metaphysics example is Geometric Algebra (GA), which provides an operational alternative to the traditional matrix and tensor dominated approach of linear algebra. In GA all entities are first class: tensors, matrices and vectors give way to the same kind of entity. Everything in GA becomes a geometric entity. As in the Operational Calculus, the geometric entities form a simple algebra where, in the case of GA, the role of the geometric product (descending from Grassmann and Clifford) is paramount. The work of Hongbo Li highlights this key aspect of the operational methodology (Li, 2008). Li applies the conformal aspects of GA methodology to provide remarkably simple automated proofs of geometric theorems. A key construct in his algorithms is to privilege as much as possible multiplicative operations at the expense of the additive. To Li, more additive operations mean more algebraic clutter, leading to what he calls mid-term-swell. On the other hand, more of the multiplicative means the retention of geometric meaning and results in great simplification. Li clearly demonstrates how automated proofs, and geometric computation in general, can be greatly simplified using his approach. With more traditional linear algebra and brute force Clifford algebras, the resulting mid-term-swell can be so enormous that solutions become, at best, purely notional. Another key term emerging from Li's work is the purely multiplicative polynomial, the monomial. The monomial expresses pure geometric semantics based on multiplication, free of additive algebraic clutter. In many cases, Li's methodology expresses geometric concepts that distil down to monomials, leading to spectacularly simple solutions free of the dreaded mid-term-swell that afflicts non-operational methodology. As will be seen further on, the monomial construct will turn out to be of fundamental importance in this project.
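To make the multiplicative flavour concrete, here is a toy sketch of the geometric product in a small Clifford algebra with Euclidean signature (each basis vector squares to +1). This is an illustrative miniature, not Li's conformal machinery; multivectors are represented as plain dictionaries from basis blades to coefficients:

```python
# Toy geometric product on basis blades, e_i * e_i = +1 (Euclidean signature).
# A blade is a sorted tuple of basis-vector indices; () is the scalar blade.

from itertools import product

def blade_mul(a, b):
    """Multiply two basis blades; return (sign, blade)."""
    merged = list(a) + list(b)
    sign = 1
    changed = True
    while changed:
        changed = False
        for i in range(len(merged) - 1):
            if merged[i] > merged[i + 1]:
                # swapping two distinct basis vectors flips the sign
                merged[i], merged[i + 1] = merged[i + 1], merged[i]
                sign = -sign
                changed = True
            elif merged[i] == merged[i + 1]:
                del merged[i:i + 2]        # e_i e_i = 1: the pair cancels
                changed = True
                break
    return sign, tuple(merged)

def mv_mul(x, y):
    """Geometric product of multivectors given as {blade: coefficient} dicts."""
    out = {}
    for (ba, ca), (bb, cb) in product(x.items(), y.items()):
        s, blade = blade_mul(ba, bb)
        out[blade] = out.get(blade, 0) + s * ca * cb
    return {b: c for b, c in out.items() if c != 0}

e1, e2 = {(1,): 1}, {(2,): 1}
print(mv_mul(e1, e1))   # {(): 1}        e1*e1 = 1
print(mv_mul(e1, e2))   # {(1, 2): 1}    e1*e2 = e12
print(mv_mul(e2, e1))   # {(1, 2): -1}   e2*e1 = -e12
```

Products of blades stay blades (monomials in Li's sense); it is only addition that introduces the multi-term clutter behind mid-term-swell.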

In passing, one should note that the modern formulators of GA such as David Hestenes as well as Li consistently claim GA to be the universal algebra of physics and mathematics (Hestenes, 1988). I concur with this appreciation of GA with the proviso of introducing a number of important ingredients reported in this post.

There is one other example of a weak metaphysics methodology that I will be examining in more detail further on. It might seem surprising that I put forward Gödel's work on the Completeness and Incompleteness Theorems as such an example. His work is important for this project as it brings into play the logical dimension of metaphysics and, moreover, the dichotomy between what is true and, more fundamentally, what is the truth. Of great significance is the fact that Gödel's work is not mere metaphysical speculation, as it takes place in the full glare of an ingenious mathematical formalism. More of that later.

Contribution of the Stoics

The Operational Calculus and Geometric Algebra provide clear examples of operational methodology. They illustrate an important aspect of metaphysics in the sense of the first classness of the fundamental entities. However, they do not embrace the most fundamental aspect: including not just a science of object but also a science of subject. In order to start getting a grasp of what is meant, I turn back to the philosophical terrain of Hellenistic times. The bilateral perspective that I am trying to explain can be seen in the schism between the Epicurean and Stoic schools of thought of that time.

The diachronic left side take was advocated by the Epicureans. The Epicureans were atomists, and believed in a materialist, deterministic world view that is not incompatible with the view of traditional modern science. The exception to absolute determinism was the famous Epicurean Swerve construct whereby, according to the Epicurean doctrine, every now and then atoms would imperceptibly deviate from a strictly deterministic trajectory. In this way, the unstructured primordial universe somehow micro-swerved to evolve to the state it is today. In the broad sweep of the history of ideas, I see the Epicureans and their atomist forebears as early exponents of the left side, diachronic take on reality.

Of central interest in this post are the much less understood early exponents of right side, non-diachronic reality. Here, I am talking about the implacable foes of the Epicureans, the Stoics. The alternative right side approach, exemplified by the Stoics, concentrates on studying the world in between the a priori and the a posteriori, the world that exists now relative to the organism in question. For the Stoics, only corporeal bodies with extension exist, and only what exists can act upon and be acted upon. Objective reality is sandwiched between the a priori and the a posteriori. To the Stoics, things in the past or in the future do not exist; there is only what exists now, relative to the organism in question. Heroes of the Now, the Stoics had no fear of anything in the past or the future; such things simply do not exist.

As Hahm remarks “For half a millennium Stoicism was very likely the most widely accepted worldview in the Western world.” (Hahm, 1977) However, it was the world view of the diametrically opposed Epicureans that best corresponds to the present day analytic, diachronic world view of our time, not that of the Stoics. Moreover, Stoic physics, according to my characterisation, is not physics as the moderns understand it but metaphysics. As such, their perspective on reality should be operational. This is indeed the case as Stoic physics ticks all the boxes in providing an operational perspective on reality. First of all, Stoic reality is articulated in terms of first class entities according to the mantra: everything that exists is a material body. For the Stoics, the property of an entity was also an entity in its own right thus guaranteeing that entities are first class. Thus in Stoic physics, properties are also material bodies. As for the entities forming an algebra, at least the Stoics identified the letters of the algebra in borrowing the four primordial letter alphabet of Empedocles. This necessarily leads to acceptance of the ancient four-element theory of matter where each primordial element corresponds to one of Empedocles’ four “root” letters.

The Stoics also borrowed from Heraclitus. Heraclitus saw everything in terms of oppositions. Each of the four elements expressed a primordial tension between opposite poles of an opposition. These elements were called Air, Water, Earth, and Fire. Air represented an expansive tension, Water a contractive tension, corresponding to the images evoked by such naming. Earth would (or should, according to me) have been seen as an unsigned tension between two different extensions, and Fire as an unsigned tension between two different (but indistinguishable) singularities. Physical reality for Heraclitus could thus be interpreted as the interplay of these four primordial tensions. Heraclitus saw these primordial tensions as four instances of one single, even more primordial tension called pneuma. Thus, the four element theory became a five element theory of sorts.

Category Theory and the Five Morphisms

To modern eyes, the ancient four element theory might seem like abstract nonsense. However, there is a branch of mathematics that sometimes actually prides itself on its "Abstract Nonsense," viz. Category Theory. Category Theory, despite being encased in a diachronic axiomatic framework, also reveals operational aspirations. Its first classness is expressed in the mantra: everything is a morphism. Morphisms can be represented by arrows, and so Category Theory sees its reality in terms of dyads, not monads as does pure and simple Set Theory. Category Theory rediscovers Heraclitus's four kinds of tension in terms of four distinct kinds of morphism. Instead of Air, Water, Earth, and Fire, Category Theory comes up with the epimorphism, monomorphism, bimorphism, and isomorphism. In Set Theory these morphisms become functions. For functions, there is no difference between bimorphisms and isomorphisms. Note also the "expansive" nature of an epi, the "contractive" nature of a mono, and that the inverse of a bi or iso is a bi or iso, much as Heraclitus would have expected.
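For finite sets the Set Theory specialisation is easy to check mechanically: for functions, a monomorphism is exactly an injection, an epimorphism a surjection, and so bimorphisms coincide with isomorphisms (bijections). A small sketch (the particular functions are arbitrary examples):

```python
# Classify a function between finite sets, where (for functions)
# mono = injective, epi = surjective, and bi coincides with iso (bijective).

def classify(f, domain, codomain):
    image = {f[x] for x in domain}
    mono = len(image) == len(domain)   # "contractive" test: nothing collapses
    epi = image == set(codomain)       # "expansive" test: everything is hit
    kinds = []
    if mono:
        kinds.append("mono")
    if epi:
        kinds.append("epi")
    if mono and epi:
        kinds += ["bi", "iso"]
    return kinds

# A surjection that is not an injection ...
print(classify({0: "a", 1: "a", 2: "b"}, [0, 1, 2], ["a", "b"]))
# ... and a bijection, which is mono, epi, bi, and iso at once.
print(classify({0: "a", 1: "b"}, [0, 1], ["a", "b"]))
```

In general categories the coincidence of bi and iso fails, which is why the four kinds remain distinct there.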

The vocation of Category Theory is to study mathematical structures which are common to all of mathematics. Thus one could say that these four kinds of morphism constitute the stuff that mathematics is "made of." Note also that there is an even more primordial morphism in Category Theory than these four: the natural transformation. Saunders Mac Lane, cofounder of Category Theory, once stated that he invented Category Theory in order to study natural transformations. Natural transformations take up the fifth spot in a "five element theory of mathematics."

Stoic Logic

The Stoics embraced Heraclitus's theory of the five elements and the primordial tensions they convey, and incorporated it as the basis for their physics. The Stoics claimed that their philosophical system included physics together with logic and ethics, making up a harmonious whole. However, as de Lacy commented back in 1945:

One of the many paradoxes associated with Stoicism is the puzzling circumstance that although the Stoics themselves claimed that their philosophy was a perfectly unified whole – so well unified indeed that its various parts could not be separated from one another, and the change of a single item would disrupt the whole system, yet the opponents of Stoicism, even in ancient times, regarded the Stoic philosophy as a mass of inconsistent and incompatible elements. Since much of our information about Stoicism comes from hostile sources, it is much easier for the modern investigator to find the inconsistencies of Stoicism than its unity. In recent years there have been a number of studies attempting to find the unifying element, but the problem is by no means solved. (de Lacy, 1945)

The situation hasn’t advanced much since then. In this post based on previous work, I provide the unifying element for the Stoic system. For the moment, I will simply point out the structural similarities between Stoic physics and Stoic logic.

Stoic logic in its entirety covered a vast range of subject matter, ranging from rhetoric to dialectics, including many subjects that would not be regarded as logic from a modern perspective. However, for the purposes of this post we need only consider the core logical system. For the Stoics, rational reality was subject to the logical principles of the Logos L. The Stoic interpretation of the Logos L took the form of their system Ls, based on the five indemonstrables, considered in detail later. A simplistic interpretation of Stoic logic Ls is to see it as the first historical example of the propositional calculus; in other words, it expresses the zero order logic of particulars. In later work, I intend to show that Ls can be thought of as a first order logic with powerful spacetime-like geometric semantics Gs.

However, for the moment we must be content with a cursory description of how each of the five indemonstrables maps to the corresponding element of the Stoic-Heraclitus physics system Ps. Thus, the question is: how does the Stoic system unite physics with logic? More precisely, how does Stoic logic Ls, based on the five indemonstrables, relate to the Stoic five element theory of substance Ps? The relationship between Ls and Ps has already been reported from several different perspectives in previous papers. The essence of the relationship is illustrated in Figure 1.

Figure 1. Illustrating the Stoic relationship between Ls and Ps, and the corresponding Heraclitus diagrams.

Stoic physics adopted the four element system of Empedocles, including the gender typing. The gender construct is explained in my previous works and will be explained further on in this work. I technically refer to it as ontological gender. Gender is the key to understanding how all of this fits together. There is a learning curve for appreciating the full extent and subtleties of the gender construct, the most subtle of all distinctions. For the moment, think of the masculine as expressing pure form. The purest and most primordial expression of form is the singularity. Expressed linguistically, the masculine is pure "is-a." On the other side of the gender divide is the feminine which, in isolation, can be thought of as pure formless extension. Linguistically, the feminine is pure "has-a." The gender calculus (yes, it does form a calculus) expresses the dialectics of the is-a and has-a relationship. As I said, this is the most subtle of all distinctions. It is also the most fundamental.

To be expanded upon….


Kant, I., 1781. The Critique of Pure Reason. Project Gutenberg EBook: http://www.gutenberg.org/files/4280/4280-h/4280-h.htm.

Moore, D. J. H., 2012. The First Science and the Generic Code. Parmenidean Press. 450 pages.
Moore, D. J. H., 2013a. Now Machines.
Moore, D. J. H., 2013b. The Whole Thing is a (Now) Number.
Moore, D. J. H., 2013d. Logic Driven Physics: How Nature’s Genetic Code Predicts the Standard Model.
Moore, D. J. H., 2013. The Universal Geometric Algebra of Nature: Realising Leibniz’s Dream.
Moore, D. J. H., 2013. Generic Model versus Standard Model Interactive Database. [Online Database Application]

Reverse Engineering the Genetic Code

This post is a slightly edited version of a submission I recently made for a challenge prize competition. I didn’t win, but the submission provides a reasonable and short overview of my project.


[Image: genetic code]

Reverse Engineering the Genetic Code

Understanding the universal technology platform of Nature

Executive Summary

My proposed platform technology for advancing the life sciences is none other than the genetic code itself. Even though all life forms evolve over time, the universal language that codes them has remained virtually unchanged over billions of years. If one wants a fundamental platform for exploring and explaining life, the answer is already there in this universal language of Nature. The Central Dogma of biochemistry implies that the genetic code is a mere transcription language. My project challenges the dogma with the central claim that the four letters of the genetic code express logico-geometric, spacetime-like semantics. In fact, the four letters {A,T,G,C} express timelike, lightlike, spacelike, and singular-like semantics respectively. A central aim is to reverse engineer the code from first principles. In so doing, the code becomes the operational calculus for explaining the organisational principles of life.

The broad idea is not new and was envisaged by Leibniz over three centuries ago. In a famous passage, he sketched out his dream of developing a geometric algebra without number based on only a few letters that would simply and non-abstractly explain the form of the natural things of Nature. One could say that Leibniz anticipated the genetic code. However, his vision went much further than that. He claimed that the resulting algebra would have logico-geometric semantics and so his vision becomes quite revolutionary. Even more revolutionary still, he claimed that the same geometric algebra would explain, not just the animate, but also the inanimate. We now know that the organising generic material of biological organisms is distinct from the functional material of the organism. In the inanimate case of an “organism” like our universe, there appears to be no observable distinction between organising substance and the organised. Thus, if Leibniz’s vision is valid for the inanimate, then the elementary particles of Particle Physics should be directly and simply explained in terms of the four-letter algebra of the genetic code—now playing the role of a truly universal generic code. For inanimates like our universe, the organising material and the organised are the same stuff.

My project involves making Leibniz’s vision tractable by developing his Analysis Situs geometry without number in order to provide the logico-geometric semantics of the genetic code. My ideas have rapidly matured over the past year, resulting in the publication of one book and drafts of four long papers on the subject. The third “Leibniz paper” is the most pivotal. The rough draft of the fourth paper shows how the same genetic code organisation predicts the Standard Model of Particle Physics and even surpasses it. Because of its non-empirical nature, my Leibniz style methodology can predict not only the explicitly measurable particles but also the implicit ones, which may be impossible to observe empirically.

The Big Picture

This project takes a leaf from Nature and provides a bilateral approach to science. There are two takes on Nature, requiring two “hemispheres” of knowledge. I refer to present day sciences as left side sciences. Left side sciences specialise in explaining the a posteriori in terms of the a priori. The empirical sciences harvest data and develop compatible theories to predict future outcomes. Axiomatic mathematics works deductively from a priori axioms to prove a posteriori theorems.

The alternative right side approach, exemplified by the Stoics, concentrates on studying the world in between the a priori and the a posteriori, the world that exists now, relative to the organism in question. For the Stoics, only corporeal bodies with extension exist. Only what exists can act upon and be acted upon. Thus, the Stoic perspective is that objective reality is sandwiched between the a priori and the a posteriori. The perspective is comparable to Leibniz’s, albeit more materialist.

The objective reality of an organism is anchored in the immediacy of its Nowness. I call machines based upon this principle Now Machines. I claim that all animates and inanimates are based on the Now Machine principle. The underlying principle is that the organism must not be subject to any extrinsic a priori principle. Borrowing a term from Computer Science, I call this principle First Classness (FC). The dominating principle of Now Machines is the non-violation of FC. The logic involved is similar to the Liar Paradox construct that Gödel used to prove that (left side) mathematics is incomplete. In right side mathematics, it becomes the organisational, self-justifying principle of Now Machines.

The mathematics of corporeal bodies acting and being acted upon leads to a particular kind of geometry with direct historic roots to Leibniz. As succinctly explained by Hongbo Li:

Co-inventor of calculus, the great mathematician G. Leibniz, once dreamed of having a geometric calculus dealing directly with geometric objects rather than with sequences of numbers. His dream is to have an algebra that is so close to geometry that every expression in it has a clear geometric meaning of being either a geometric object or a geometric relation between geometric objects, that the algebraic manipulations among the expressions, such as addition, subtraction, multiplication and division, correspond to geometric transformations. Such an algebra, if exists, is rightly called geometric algebra, and its elements called geometric numbers. (Li, 2008)

Li, together with David Hestenes and other exponents, claims that Geometric Algebra (GA) is the universal language of mathematics and science and so realises Leibniz’s dream. I consider their claim premature, as it ignores two vital aspects of Leibniz’s vision. The claim ignores the truly universal genetic code of Nature “based only on a few letters.” In addition, although GA is not based on coordinates, it still relies on ordinary numbers under the hood. Such a number scheme imposes absolute extrinsic ordering relationships from outside the system and so violates FC. I propose a solution founded on the ancient construct of ontological gender. The pure feminine gender entity is considered to have an attribute, albeit undetermined. The pure masculine gender type is that attribute as an entity in its own right. Thus there are two entities: the feminine has an attribute; the masculine is that attribute. The feminine corresponds to pure geometric extension, the masculine to geometric singularity. These are the two building blocks of Now Machines. With gender, the genetic code letters {A,T,G,C} can be expressed by the four binary genders {MF,FF,FM,MM}. Viewed from outside the system, genders are indistinguishable and so appear to be in superposition, opening the way to Quantum Mechanics interpretations. Like Doctor Who’s Tardis on TV, a Now Machine appears bigger on the highly tuned and coded inside than the amorphous mass of superposition seen from the outside. The algebra of gender can replace the algebra of ordered numbers to provide a true “geometry without number.” The gendered version of GA articulates the dynamic geometric semantics of the genetic code and provides the final realisation of Leibniz’s dream.
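The letter-to-gender correspondence just described can be written down directly. The sketch below is purely illustrative; the function name and data layout are my own assumptions for exposition, not part of the formal gender calculus.

```python
# Illustrative sketch of the claimed correspondence between the genetic code
# letters {A,T,G,C} and the binary genders {MF,FF,FM,MM}.
# The names used here are assumptions for illustration only.
GENDER_OF = {"A": "MF", "T": "FF", "G": "FM", "C": "MM"}

def gender_word(sequence):
    """Translate a sequence of genetic-code letters into its binary-gender word."""
    return [GENDER_OF[letter] for letter in sequence.upper()]

print(gender_word("ATG"))  # ['MF', 'FF', 'FM']
```

Note that the mapping is a bijection, so a gender word can always be decoded back into its letter sequence; nothing is lost in either direction.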


New Science: Nature abounds with bilateral structures and asymmetries that remain unexplained by present day science. For example, why are all biologically produced amino acids left handed (L-form)? In the inanimate realm, why are there no right-handed neutrinos? In order to address these kinds of questions, a new kind of science is necessary. Not only must science explain bilateralism in Nature, but the science must itself take on a bilateral epistemological architecture. Like the biological brain, science must develop two distinct but complementary takes on reality. In modern times, there has only been one “left side” science. This project unearths the complementary “right side.”

Overcoming Barriers: Nature herself has technological differences but no ontological barriers. The new right side science I propose unifies the science of the inanimate with the animate. “Life is everywhere,” so to speak.

Public Impact: Left side science got off the ground with Leibniz and Newton’s discovery of calculus, the ultimate public impact of which is incalculable. Right side science starts with the discovery of how the genetic code harbours the geometric calculus and semantics of life systems ranging from the animate to the inanimate. The public impact would surely be comparable.

Science Deficits: Psychologists have discovered that a patient with only a fully functional left brain may exhibit bizarre behaviour, like only eating food on the right side of the plate. They call it hemineglect. I claim that left side mathematics suffers the same “cognitive deficit.” The phenomenon can be traced to left side geometry, which only needs timelike and spacelike lines to work. In other words, the geometry only uses the two-letter alphabet {A,G}. It only fires on two cylinders! The right side geometry is based on the genetic code letters {A,T,G,C} and so, like its right hemisphere biological counterpart, is cognizant of both sides of a bilateral world. Thus, in some cases, better instrument technology in left side science will be pointless because of the hemineglect blind spot of left side mathematics, and the mathematician will never know.

Both right side science and its right brain counterpart suffer a different kind of deficit. They are mute. However, although communication to the outside of the system is impossible, the right side can communicate with itself. That is what the universal language of Nature is for.


Present orthodoxy sees living organisms as the results of evolution. Thus, man is the product of millions of years of genetic accidents. He is a genetic freak. The alternative right side science view is that the very essence of life is present from the very beginning. As foreseen by Leibniz, there is a universal algebra articulating the same life essence shared by all beings, ranging from the neutrino, the quark and the amoeba through to man. In this context, man emerges from a universal principle, a much nobler scenario than being a genetic freak.

Some novel points:

  • Science should be bilateral like the two brain hemispheres.
  • Everything from the ground up can be explained in terms of gender
  • The letters {A,T,G,C} of the genetic code correspond to the binary genders {MF,FF,FM,MM}
  • The organisational principle of life is based on a form of the Liar Paradox
  • Leibniz was right on the money. The Stoics also had the right mind set.

Risk and Challenges

If this kind of science were to be fundamentally intractable, as many claim, then the project would be doomed to failure. After many decades of effort, my four draft papers demonstrate tractability and hence remove that risk.

The challenge of developing the new mathematics required is quite daunting and I need help. One sub-project, possibly even Nobel Prize material, is to explain the so-called degeneracy of the genetic code, at least in the biological realm. My approach is that each codon codes an elementary geometric form. According to my theory, the start codon ATG expresses the Lorentz semantics of Special Relativity, where the codon is made up of a single timelike A, lightlike T, and spacelike G form. Such a composite geometric form can be considered homogeneous and so satisfies FC. Hence, no need for degeneracy. The only other non-degenerate codon is TGG. TGG codes the semantics of a de Sitter space, which has known General Relativity interpretations and is homogeneous. I claim that, for homogeneity compliance, all other elementary forms must be appended with extra dimensions. Hence the degeneracy for all other codons.
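The codon-by-codon reasoning above can be sketched as a toy classifier. This is a hedged illustration only: the letter-to-semantics mapping follows the claim in the Executive Summary, while the function names and the hard-coded set of non-degenerate codons are my own assumptions, not a derivation within the theory itself.

```python
# Toy sketch of the claimed codon semantics: A=timelike, T=lightlike,
# G=spacelike, C=singular-like. ATG and TGG are the two codons claimed
# above to be homogeneous (Lorentz and de Sitter forms respectively)
# and hence non-degenerate. Names and structure are illustrative only.
SEMANTICS = {"A": "timelike", "T": "lightlike", "G": "spacelike", "C": "singular-like"}
NON_DEGENERATE = {"ATG", "TGG"}

def codon_semantics(codon):
    """List the claimed geometric semantics of each letter in a codon."""
    return [SEMANTICS[letter] for letter in codon.upper()]

def needs_extra_dimensions(codon):
    """Per the claim above, every codon other than ATG and TGG must be
    appended with extra dimensions for homogeneity (FC) compliance."""
    return codon.upper() not in NON_DEGENERATE

print(codon_semantics("ATG"))         # ['timelike', 'lightlike', 'spacelike']
print(needs_extra_dimensions("TGG"))  # False
```

On this picture, the degeneracy question reduces to asking which of the 64 codons fall outside the two-element homogeneous set, and what extra dimensions each of the remaining 62 requires.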



Li, H., 2008. Invariant Algebras and Geometric Reasoning. Singapore: World Scientific Publishing.
Moore, D. J. H., 2012. The First Science and the Generic Code. Parmenidean Press. 450 pages.
Moore, D. J. H., 2013a. Now Machines.
Moore, D. J. H., 2013b. The Whole Thing is a (Now) Number.
Moore, D. J. H., 2013d. Logic Driven Physics: How Nature’s Genetic Code Predicts the Standard Model.
Moore, D. J. H., 2013. The Universal Geometric Algebra of Nature: Realising Leibniz’s Dream.
Moore, D. J. H., 2013. Generic Model versus Standard Model Interactive Database. [Online Database Application]


Science is a Belief System

I’m talking about left and right hemispheres. Sometimes it’s not clear whether I’m talking about the biological brain, the mind, or the epistemological divide between the empirical sciences and the “other” way of thinking. I’ve given up making the distinction. I’d rather be hanged for a tiger than a sheep. Well, the whole Cosmos is split into left and right sides, and the cleavage line goes right down the middle of the author’s skull. I must point out that the same situation applies to you as well. No one is spared being chopped in half. In the post after this one, we see that even God is split in two.
Traditional Sciences form a Belief System
The left hemisphere bathes in abstract reality. This great bubble of floating rationality works from propositions, which have truth-values. As such, each proposition expresses a belief. The left side subject believes propositions deemed to have true truth-values and disbelieves those deemed to have false truth-values. A considerable source of angst for the left hemisphere is figuring out which propositions to believe and which to disbelieve. The angst comes from the fact that the whole rational apparatus is suspended mid-air in a world of abstraction. This abstract bubble of rationality has no logically expressible relationship to the non-abstract world, whatever that might be. Like Descartes contemplating his own thinking, this miserable isolated left hemisphere eventually arrives at its core belief: it is true that I exist because I’m thinking about it. Of course, this Cartesian proposition is only a belief. Like all the uni-directional propositions that populate the left hemisphere, it has, by its very uni-directional nature, a truth-value that may be true and may equally be false. The only critical faculty available to left side reasoning is the demand for internal logical coherence of its belief system.

The end result is that left side thinking can be very sharp at detecting the most subtle logical irregularities, contradictions and variances from the current prevailing belief system. This is the strong point of left side reasoning. The weak point is that the resulting belief system can creep so far away from common sense that it becomes quite whacky; fundamentalist religious and political belief systems can even become very dangerous and destructive. Modern sciences, exploiting the analytical clarity of the left side, try to avoid creeping into insanity through peer review and by attempting informal common sense interpretations of empirical data.

The corpus of knowledge making up the present day sciences constitutes a gigantic belief system. Karl Popper cottoned on to this fact by providing his well-known criterion for a belief system: a belief system is one in which no proposition is absolutely and definitively true. For Popper, every proposition in a belief system is only provisionally true and may be “falsifiable.” For this to be possible, all propositions must enjoy the rational status of possessing a truth value, hence providing the possibility of being either true or false.

Karl Popper effectively declared that modern science, according to his falsifiability criterion, was fundamentally a belief system. He then went on to use the criterion in the reverse sense: if any pretender to scientific knowledge was not a belief system, then it was “unscientific.”

For traditional science, the minimal requirement for an assertion to be acceptable as scientific is that it be either believable or unbelievable. This requires that the assertion can be stated as a proposition possessing a truth value, thus providing an object of belief (true truth-value) or disbelief (false truth-value). Left side science is intimately wedded to a brand of logic that assumes the Law of the Excluded Middle. There is no middle way. In science, propositions are believed either true or false. There is no “cannot be determined” or “not applicable here” clause in the logic of the empirical sciences. If a proposition should indeed offer such “third option” possibilities, then it could not be an object of belief or disbelief and so would not be accepted as potentially scientific. The validity of an empirical science proposition must be black or white; there are no greys.
Traditional science is based on abstraction. A fundamental characteristic of abstract reasoning is that it does not demand that objects exist or not. This is seen as its power. A favourite topic for abstract reasoning is the proposition “God exists.” Is the proposition true or false? The same question can be asked about unicorns and gravitons. Do they exist? According to the Law of the Excluded Middle, the answer must be true or false. The basic assumption of abstract reasoning is that existence is an attribute. Something existing or not existing is like something having mass or being massless. Existence is a mere attribute that some things have at a particular point in time. Unicorns will never have existence because they are fictional. Unicorns do not exist and never will. Socrates also does not exist, but for a different reason: he is dead. The Judeo-Christian god is an entity that possesses this existence attribute. God exists. In the form of his son, he even once existed in the flesh. What is more, he can return in the flesh at any time. The Judeo-Christian god is distinct from any other god by its existence attribute. Thanks to this attribute, the citizen is faced with a stark choice. The citizen, being an abstract thinker, must respect the Law of the Excluded Middle. He can believe that God exists and so enter into the communion of believers. Alternatively, he can believe the contrary: God does not exist, he declares. He thus enters the club of the atheists. Theist or atheist? That is the question. It is in this way that the god-fearing believer and the god-hating atheist join hands in a common goal. They are all people who believe that the god question is a reasonable question with a clear and precise answer. They are all creatures driven by belief. Of course, there might be a third option, that of the agnostic. However, the agnostic must climb to even more illustrious heights and start musing over whether the Law of the Excluded Middle is valid or not, and why.

Not all people are creatures of belief. This is the case with Allah and the Hindu gods. In the secular Islamic world, for example, there are no atheists as there are in the secular Judeo-Christian world. No one, not even the most devout Muslim, believes in Allah, and so no one can disbelieve in Allah. Allah is not an object of belief, as Allah is beyond the true and the false. With Allah, belief is inconsequential; what matters is faith. Allah is determined by the faith of the individual. If you hold such faith, then Allah is your god. If you are secular, not only do you have no god, you have no concept of god. There is no debate. There cannot be any debate between the faithful and the infidel, just a different state of being based on faith or the lack of it. The difference between belief and faith can be difficult for Westerners to comprehend.

There is a big difference between belief and faith. For example, someone can believe in fairies but it is difficult to imagine having faith in fairies. In Christianity, belief comes first and faith second. It is quite possible for a Christian to have a crisis in faith and even lose the faith. Nevertheless, the Christian will still believe in God.

The Christian god is qualifiable by a proposition that satisfies the Law of the Excluded Middle. The proposition “God exists” can thus be considered a scientific hypothesis. This is where Popper steps in and adds an extra requirement for a proposition to be acceptable as a scientific hypothesis: the proposition must be falsifiable. The general consensus amongst Judeo-Christian theists and atheists alike is that the proposition “God exists” is not falsifiable. There is no scientific experiment that could possibly refute it. Thus, by Popper’s criterion, the question of whether God exists cannot be covered by science. Once again, the theists and atheists usually concur on this conclusion, something which underlines the unanimity of theists and atheists in Judeo-Christian culture. Theists and atheists mutually agree on everything except the particular truth value of a proposition.

However, things aren’t as simple as that. Some atheists have felt threatened by theists who have started to peddle a fundamentalist view of creation. To counter the inroads that the Creationists are making into the education system, the atheists have resurrected some nineteenth century science to act as an alternative beacon of inspiration for our youth. They call this alternative to Creationism, Darwinism. The atheists peddle the Darwinist message that every human being on this planet is the end result of a long series of random genetic mutations leading to what we are today. By selling Darwin tee shirts over the web and promoting this inspirational message across the media, the atheists hope to win the day.

The battle between the Creationists and the New Darwinists seems to be essentially peculiar to the US. What is of concern in this section is the scientific status of Darwin’s Theory of Evolution. Firstly, one should note that the basic epistemological basis of the Theory of Evolution was due to the Epicureans of ancient Greece and so preceded Darwin by some two thousand years. Despite a similarity in their scientific outlook on the world, the Epicureans differed from the New Darwinians in their views on how to enjoy life. The New Darwinian advocates getting meaningful pleasure out of life by getting excited about new pictures posted on the Hubble telescope website. The Epicureans took a different tack. Rather than pleasure being a mere by-product of certain kinds of scientific pursuit, they turned the pursuit of pleasure into the central object of science itself. They argued that one of the worst obstacles to leading a happy, pleasurable life was fear of the gods. This led the Epicureans to take the theological position that the gods were distant from humans and totally uninterested in human affairs. Of particular interest was their scientific outlook. The Epicureans, although not empirically minded, held a philosophical outlook similar to that of the traditional sciences. They were strict materialists, atomists and determinists. The whole world was in the vice of a strict determinism of cause and effect. But, like modern physics, there was an exception to this draconian determinism. Epicurus called it the Swerve. Atoms moved and interacted with each other in a totally deterministic way, but every now and then an atom would execute an imperceptible, totally random “swerve.” Epicurus exploited this notion to develop his Swerve Theory of the universe. At the beginning of the cosmic cycle, the world is non-structured: all atoms fall down in straight vertical lines, according to Epicurus. After an immensely long time, because of the accumulated random swerves of the otherwise deterministic atoms, the universe micro-swerved into the way it is today.

Amazingly, this picture is no different in principle from that of modern science. The random beginnings were a bit different, but the micro-swerving into the world of today is the same belief. Since Epicurus’s time, Swerve Theory has come a long way. The random swerves of atoms have been confirmed and even quantified. Nowadays the Epicurean Swerve Theory is known as Heisenberg’s Uncertainty Principle and is expressed in the wave equation.

Swerve Theory has been applied to the biological realm, where it manifests itself as genetic mutations. Just as collections of atoms micro-swerved to produce the first single celled living creature, further random swerving eventually led to the animals that we have become today. It’s all a product of Epicurean Swerve Theory.

Darwinism also adds the survival of the fittest paradigm as an embellishment of Epicurean Swerve Theory. It is claimed that genetic swerving is not a completely random process, as some swerves are more successful than others and hence are directed by success. The successful swerves then go on to propagate other successful swerves. The end result is that only the fittest survive. In fact, the survival of the fittest paradigm is a huge red herring. It’s about as meaningful as saying that the survivors of a car crash are fitter than those who perished. The paradigm is a simple tautology. Those who survive are the fittest; the fittest are those who survive. The semantics are the same, only the labels change. All up, the survival of the fittest paradigm adds nothing to elementary Epicurean Swerve Theory. To name the survivors as the fittest is just a change of terminology. We are all the “fittest”; we are all the last men standing; we are all the survivors of a trillion times a trillion Epicurean swerves. And that is the way we came to be the way we are today, believe it or not, says the Theory.

It would seem that Epicurean Swerve Theory and its modern biological successor, Darwinism, can be expressed as a theory that people can believe or disbelieve. Thus the theory could be taken as a traditional scientific hypothesis. However, there is no way to possibly refute the hypothesis. That things change deterministically with a random component is hard to refute. This is what Karl Popper himself eventually recognised. By his falsification criterion, Darwinism was unscientific! Popper initially accepted this conclusion and only later tried to worm his way out of it. Showing that we didn’t just drift to where we are today, and thus refuting Darwinism, is a task that even Popper couldn’t convincingly achieve.

Where Darwinism wins prestige is in the notion that the theory explains something. It supposedly explains evolution. However, it has only descriptive, not explanatory, powers. It describes evolution. As we know, the evolutionary process flies in the face of what is predicted by the second law of thermodynamics, according to which there should be an ineluctable drift towards increased entropy and thermodynamic death. This drift is deterministic, as there is no Epicurean Swerve or Heisenberg Uncertainty in classical thermodynamic theory. In the evolving world we live in, the opposite seems to be the case. Evolution leads to an apparent decrease in entropy, a steady rise in diversity rather than a steady fall. Darwinism describes this phenomenon, but does not explain it.

We are now coming to the end of this section with the basic understanding that science is based upon falsifiable belief. As for religion, it is based either on non-falsifiable belief or on faith, which is impervious to belief. In the third slot is Darwinism. Darwinism can be taken as lying somewhere in the domain of Epicurean Swerve Theory. Alternatively, it can be taken as a non-falsifiable belief that things, particularly living things, evolve, and so it is in the same boat as the religions. The New Darwinians seem to prefer the latter option and see it as a viable religion substitute, but still a religion nevertheless.

It is now time to carry out another exercise in semiotic analysis. This time we will end up with a system based on belief on the left side and a system based on something else on the right. The right side system is based on faith. This will be an exercise in theology. See the next post.

Key Phrases: Semiotic square, genetic code, generic code, DNA, start codon, left right hemispheres, the divided brain, epistemology, anti-mathematics, masculine, feminine, gender differentiation, Generic Science, science as a belief system.


D. J. H. Moore