Scientific theology: a new history of creation
Chapter 6: Creating the divine world
6.1: Fitting physics to theology
6.2: Clues from the ancient god
6.3: The mathematical community
6.4: Divinity to trinity
6.5: Physical evolution 1: action to energy
6.6: Physical evolution 2: energy to entropy
6.7: Quantum theory and the creation of space-time
6.8: Measurement, creation and insight
6.9: Gravitation: the network structure of space-time
6.10: Cybernetics, algorithms and selection: P & NP
6.11: Representation: fundamental particles
6.12: Quantum communication: bosons connect fermions
6.13: Network QFT, QED and QCD
6.14: Corruption and death
6.15: Evil
6.16: Does the model fit the world?
6.1: Fitting physics to theology
If we are to have a divine world, we must fit physics to theology. At present theology is built on the ancient Greek picture of the physical world, while physics retains the medieval theology of a world created by an omniscient and omnipotent god. Both pictures need the revision that will come from bringing them into contact with one another. In both cases the central problem is that they overlook the fact that information is not just something spiritual or metaphysical; it is a physical entity. It is represented by physical entities of all sizes and complexities, from fundamental particles to the universe. The Bible was once made of ink and paper; now it may be encoded as a collection of little magnets, or electrons. Rolf Landauer: Information is a physical entity
In theology the model of god developed by Aquinas and the medieval scholastics has not changed for 800 years. If anything, the attempt to understand god has been abandoned in the face of the claim that god is so far beyond our ken that we can say nothing about them. This position was strengthened by the Protestant Reformation that divided the Christian Churches in the sixteenth century. Theology turned away from developing the science of god and turned instead to interpretation of the Bible, which is largely a political, poetic and cultural document with very little to say about the reality of god or the world.
Aquinas, like most of the ancients, equated knowledge with immateriality, that is, spirituality. Is there knowledge in God? Yes, he says, arguing from Aristotle’s doctrine of matter and form:
In God there exists the most perfect knowledge. To prove this, we must note that intelligent beings are distinguished from non-intelligent beings in that the latter possess only their own form; whereas the intelligent being is naturally adapted to have also the form of some other thing; for the idea of the thing known is in the knower. Hence it is manifest that the nature of a non-intelligent being is more contracted and limited; whereas the nature of intelligent beings has a greater amplitude and extension; therefore the Philosopher says (De Anima iii:8, 431b20) that "the soul is in a sense all things." Now the contraction of the form comes from the matter. Hence, . . . forms according as they are the more immaterial, approach more nearly to a kind of infinity. Therefore it is clear that the immateriality of a thing is the reason why it is cognitive; and according to the mode of immateriality is the mode of knowledge. Hence it is said in De Anima ii that plants do not know, because they are wholly material. But sense is cognitive because it can receive images free from matter, and the intellect is still further cognitive, because it is more separated from matter and unmixed, as said in De Anima iii. Since therefore God is in the highest degree of immateriality . . . it follows that He occupies the highest place in knowledge. Aquinas, Summa: I, 14, 1: Is there knowledge in God?
From a modern point of view, this discussion completely misses the point. It is a general metaphysical argument which does not touch on the mechanism of knowledge. Aristotle comes close when he defines the soul as the first actuality of a natural body possessed of organs. Nevertheless, he felt that the universality of human intellectual knowledge required a spiritual mind. Christopher Shields: Aristotle; Christopher Shields (1996): The Active Mind of De Anima III 5
Aristotle was not to know that the organs of a living body are constructed on a microscopic molecular scale that only became visible with the advent of electron microscopes. Indeed the structure of the world continues well beyond the reach of microscopes into the atomic nucleus, whose scale is typically millions of times smaller than any molecule. Every element of this structure may serve as a representative vehicle, so that the Universe has a vast information carrying capacity.
In physics, the representation problem emerged with the development of quantum theory. Since Galileo's time, the importance of mathematics as the route to physics has grown. Isaac Newton used the power of mathematics to explain the role of gravitation in the motion of the Solar System. The application of mathematics in classical physics is relatively straightforward, since it is easy enough to correlate mathematical symbols with physical variables like number, time, mass, energy, momentum and so on.
The difficulty arises when we turn to quantum mechanics. Sunny Auyang writes:
According to the current standard model of elementary particle physics based on quantum field theory the fundamental ontology of the world is a set of interacting fields. Two types of field are distinguished, matter fields and interaction fields. . . . The quanta of matter fields, called fermions, have half integral spins. The electron is a fermion; its spin quantum number is ½. . . . The quanta of interaction fields, called bosons, have integral spins. The photon is a boson. Its spin quantum number is 1. . . . Fermion - Wikipedia, Boson - Wikipedia
There are 12 matter fields and each has an antifield. . . .
There are four fundamental interactions. Gravity . . . Electromagnetism . . . The strong interaction . . . The weak interaction . . . Sunny Auyang: How is Quantum Field Theory Possible? page 45. Gravitation - Wikipedia, Electromagnetism - Wikipedia, Strong interaction - Wikipedia, Weak interaction - Wikipedia
If "the fundamental ontology of the world is a set of interacting fields", we must ask how these fields are represented. Auyang's statement appears to imply that the fields are prior to the particles. This raises the "measurement problem", present almost from the beginning in the interpretation of quantum mechanics. Measurement in quantum mechanics - Wikipedia
Mathematicians and physicists represent their data, thoughts and discoveries in "the literature", the physically observable set of marks which represents everything publicly shared by the community.
An important and very succinct piece of mathematical text in quantum mechanics is the Schrödinger equation: iħ ∂/∂t |ψ(t)> = H |ψ(t)>
Here H is the Hamiltonian or energy operator, which transforms the ket |ψ(t)> as time goes by. As long as the system is undisturbed this transformation is considered to be deterministic. It is also unitary, equivalent in communication terms to a lossless (reversible) codec. |ψ(t)> is a vector in a Hilbert space formed by the superposition of the complete (possibly infinite) set of basis states of the system. Codec - Wikipedia
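What unitarity means here can be checked numerically in a few lines. The following sketch is my own illustration, not part of the argument: it takes a toy two-state system with an arbitrarily chosen Hamiltonian (the Pauli X matrix, with ħ set to 1) and verifies that the resulting time evolution preserves the norm of the ket and is reversible, like a lossless codec.

```python
import numpy as np

# Toy two-state system (a qubit). The Hamiltonian is an arbitrary
# illustrative choice (the Pauli X matrix), not taken from the text.
H = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# For this H (which satisfies H @ H = I, with hbar = 1), the solution of
# the Schrodinger equation is U(t) = exp(-iHt) = cos(t) I - i sin(t) H.
t = 0.7
U = np.cos(t) * np.eye(2) - 1j * np.sin(t) * H

psi0 = np.array([1.0, 0.0], dtype=complex)   # initial ket |psi(0)>
psi_t = U @ psi0                             # evolved ket |psi(t)>

# Unitary evolution is "lossless": the norm is preserved and the map is
# reversible, so no information is destroyed along the way.
print(np.isclose(np.linalg.norm(psi_t), 1.0))   # norm preserved
print(np.allclose(U.conj().T @ U, np.eye(2)))   # U dagger U = I
print(np.allclose(U.conj().T @ psi_t, psi0))    # evolution reversed
```

Applying the inverse transformation U† recovers the initial state exactly, which is the sense in which undisturbed quantum evolution behaves like an error free, reversible encoding.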
Following Michael Polanyi it seems appropriate to call the Hilbert space which serves as the domain of the Schrödinger equation its tacit dimension. We introduced this idea in chapter 2.1. Is there a real infinite dimensional physical object corresponding to the Hilbert space which acts as the tacit dimension of quantum mechanics?
Many physicists accept the "Copenhagen interpretation" which treats the complete set of basis states that contribute to a ket as real, and then postulates something like either the "collapse of the wave function" or the "many worlds model" to account for the fact that an observation reveals only one of the states that constitute the ket. My philosophical problem is that these states seem to have only an abstract mathematical or platonic existence because there is no observable representative vehicle for them. Insofar as the basis states are all orthogonal, it does not seem possible that nature can represent such an infinity of real states (particles) in a system as small as an atom, or the even smaller nucleus of an atom. Copenhagen interpretation - Wikipedia, Wave function collapse - Wikipedia, Many-worlds interpretation - Wikipedia
This difficulty leads me to think that all the information in a quantum system is embedded in the real particles rather than in an abstract wave function or field.
It took quantum theory nearly 50 years from Planck's initial discovery of the quantum in 1900 to the clarification of the interface between quantum mechanics and special relativity in the late 1940s. First came the "old quantum theory", set in motion by Bohr's model of the hydrogen atom. Bohr's idea failed in the face of more detailed data, but in the 1920s two solutions emerged, matrix mechanics and wave mechanics. They turned out to be equivalent, and the situation was clarified when Dirac summarized the quantum story with his transformation theory and the Dirac equation. John von Neumann tidied up the mathematics by introducing abstract Hilbert space as the ideal domain of the theory. Bohr model - Wikipedia
The next step, the reconciliation of quantum mechanics and special relativity, initially proved difficult because it yielded infinite answers, but a system for removing the infinities known as renormalization was invented, so that quantum electrodynamic calculations now match experimental measurements to accuracies of parts per billion. Renormalization - Wikipedia
The appearance of infinities in attempts to interface special relativity with quantum theory and the need for renormalization point to another problem. It seems very unlikely that the infinities appearing in quantum field theory are really present in nature. Instead they suggest some problem in the mathematical approach to this work.
There seem to be two sources of this problem, both carried over from the use of continuous mathematics in classical physics. The first is that classical physics treats particles as structureless points, all of whose properties arise from their relationship to spacetime through such parameters as position, momentum, energy and time. The spacetime manifold is treated as continuous and differentiable, and a set of n particles with six degrees of freedom each (position and momentum) is described by a configuration space of 3n dimensions and a phase space of 6n dimensions.
The second is that when point particles are endowed with field properties, the self energy of these particles blows up to infinity as their size approaches zero. If such a particle is supposed to have mass or energy, its energy density is also infinite, another physically improbable feature. The classical model of the initial singularity suffers from this also: the energy of the universe is understood to be contained in a region of zero volume. We avoid this problem here by considering the initial singularity to be a quantum of action, not energy.
We might evade these distasteful consequences by considering the quantum of action to be a discrete real particle whose existence does not imply the existence of continuous space-time. Let us therefore assume that the quantum is the representation of an event or process which can be modelled as a computation. The simplest such computation is a null operation, formally identical to an error free communication channel. The most complex lie at the limit of computability defined by the Turing machine.
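The simplest end of this spectrum, the null operation, can be made concrete in a few lines. The sketch below is my own illustration (the function name is invented): an error free channel is just the identity map, delivering every symbol unchanged, so it adds no noise, loses no information, and is trivially reversible.

```python
def null_operation(symbol):
    """An error free communication channel: output equals input."""
    return symbol

# Every message passes through unchanged, so the channel is a
# computation that does nothing, which is exactly why it is lossless.
message = ["let", "there", "be", "light"]
received = [null_operation(s) for s in message]
print(received == message)   # transmission is error free
```

The most complex processes at the other end of the spectrum cannot be sketched this way: by the halting problem, there is no general procedure for predicting what an arbitrary Turing machine will do, which is one way of seeing the limit of computability the text refers to.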
Heisenberg approached this problem in 1925 when he wrote: 'The present paper seeks to establish a basis for theoretical quantum mechanics founded exclusively upon relationships between quantities which in principle are observable.' The problem here, of course, is that we cannot observe the processes represented by the mathematical machinery of quantum mechanics, only those that are represented by measurable particles in classical space-time. Scientific theology, since it is founded on quantum theory, has a similar problem. We can observe the classical fixed points of god but the underlying dynamics is hidden from us. Werner Heisenberg: Quantum-theoretical re-interpretation of kinematic and mechanical relations
This invisibility does not mean that the interactions of particles are not real. As explained in section 5.12, very simple systems may be invisible because they do not have the power to do their work and explain themselves simultaneously.
Hawking and Ellis, studying the large scale structure of the universe implied by Einstein's general theory of relativity, concluded that the universe may have started from a structureless initial singularity which, like the Christian God, is absolutely simple. This provides us with an opportunity to identify the source of the world with the Christian god. Both god and the initial singularity share just three properties: they exist, they are absolutely simple and they are the source of the universe. Hawking & Ellis: The Large Scale Structure of Space-Time
6.2: Clues from the ancient god
The idea that information is represented physically implies that real particles are the natural vehicles of information. Our starting point is the initial singularity. The god of Genesis created the universe itself by saying a series of words: "let there be light". We may understand the traditional creation as an example of intelligent design. God had a plan in mind for the universe and realized this plan outside themself, as architects and builders do.
This raises a difficulty. It is a principle in classical theology that attributes which are accidental in the created world are substantial in god. Ideas that are accidental in human minds are understood to be real in god's mind. As we will see when we come to discuss the Trinity, the Father's image of themself is understood to be the real person of the Son. So we might think god's idea of the universe is already real within god, and cannot meaningfully be duplicated outside god. Here we take this idea seriously. The divine universe we propose grows within god, not outside it. Aquinas, Summa I, 15, 1: Are there ideas?
Although the theology proposed here is radically different from the Christian story, we can still learn a lot from the picture of god developed by Aquinas. By identifying the initial singularity implicit in general relativity with the completely simple traditional divinity of pure action, we can adapt many ideas from the old theology to the new.
Our starting point is identical to the classical god of Aquinas, absolutely simple. The end point is the god of imagination, magnificent, infinite, omniscient and omnipotent, the universe that we see. The path between these two images of god is mapped by the network model outlined in chapter 5.
We have two principal avenues to understanding the physical universe: gravitation and quantum theory. General relativity describes gravitation which controls the large scale structure of the universe and so far matches the observed data perfectly. Unfortunately it is incompatible with the current form of quantum theory. Although general relativity tells us what is happening, it does not tell us why. General relativity - Wikipedia
The microscopic local properties of quantum theory lie at the beginning of the chain of explanation that leads us to a comprehensive explanation of the universe as a whole. This idea is implicit in the network model, which is symmetrical with respect to complexity. The same is true of quantum theory. Its structure remains the same whether we are dealing with two states or an infinity.
This feature enables us to move up and down the scale of complexity, carrying understanding obtained at one level to other levels. The overall metaphysical foundation for this idea I call cognitive cosmology: it makes the best sense to think of the universe as a mind. From this point of view our intelligence is nothing special. Each of us is a local version of the enormous intelligence that created us and our world.

Thomas Aquinas begins his Summa Theologiae with the claim that, starting from our observations of the world, we can prove the existence of god. Aquinas is not so much proving that god exists as that god is not the world. Given his Christian belief that god created the world, it must have been obvious to him that the creator existed. As far as I know, he never raised the difficulty of the duplication of the world raised above. Thomas Aquinas, Summa, I, 2, 3: Does God exist?
Aquinas's first way is almost identical to Aristotle's argument for the existence of a first unmoved mover. Motion, he says, is the transformation from potentiality to actuality. Aristotle holds, as an unproven axiom, that no potentiality can actualize itself; it must be moved by an actual mover. To prevent an infinite regress, which would imply that nothing can move, he must therefore postulate a being that is the source of its own action: pure action with no potential whatsoever, the realization of all possibility. Aristotle: Metaphysics XII, vii, 9: 1072 b 25 sqq
Aristotle and Aquinas both conclude from their arguments that god is pure actuality (Latin actus purus; Greek energeia or entelecheia). In more modern terms, we may say god is pure dynamics. In Christian terms, god is a living god. Both Aristotle and Aquinas define life as self motion, and Christianity accepts this idea, that god is the source of its own activity. In modern physics, potential and kinetic energy are both simply energy, freely interconvertible, as we see in the pendulum. Aristotle's axiom no longer holds.
The first thing Aquinas denies of god is complexity. Working from the idea that god is pure actuality, Aquinas concludes that god is absolutely simple, omnino simplex. Aquinas, Summa, I, 3, 7: Is God altogether simple?
This brings us up against a mystical conception of god which presents another obstacle to identifying god and the world. Having proved that god exists, Thomas then goes on to discuss the nature of god. Following ancient tradition, Aquinas believes that god is so far beyond human comprehension that we cannot say what god is, only what god is not. This ancient idea is generally known as the via negativa, the way of negation. The absolute simplicity of god appears to leave nothing for our minds to get a grip on. Apophatic theology - Wikipedia
A possible solution is provided by the mathematical discovery that logically consistent dynamic systems have fixed points. Few would doubt that god, whatever it is, is logically consistent. This belief is the foundation of the Christian idea that god, in their mysterious ways, always acts for the best, although this may not be immediately apparent to us. Power (philosophy) - Wikipedia
From a mathematical point of view, a dynamic system that maps onto itself can have fixed points. As a wheel goes round, for instance, the piece of wheel that was at one point at one moment is to be found at another in the next moment, while remaining within the wheel. Fixed point theory predicts that there will be one point which does not move. In the case of a rotating mathematically perfect wheel, this is the centre of the wheel.
On the assumption that god is all there is (which here we assume to be true by definition) all motion is within god, and so we can expect fixed points. The mathematical proofs of fixed point theorems are often non-constructive, that is they follow the via negativa by showing that a logical contradiction would arise were the theorem false. Such proofs probably apply to god. Fixed point theorem - Wikipedia
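The rotating wheel gives a fixed point we can actually compute. The sketch below is my own illustration, assuming a mathematically perfect wheel modelled as a rotation of the plane about the origin: the centre maps to itself under any rotation, while every other point moves.

```python
import math

def rotate(point, angle):
    """Rotate a point in the plane about the origin (the wheel's centre)."""
    x, y = point
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y, s * x + c * y)

angle = 1.0  # any rotation that is not a multiple of 2*pi

# The centre is a fixed point: rotating it changes nothing.
print(rotate((0.0, 0.0), angle))   # stays at the origin

# Every other point on the wheel moves to a new position.
moved = rotate((1.0, 0.0), angle)
print(moved != (1.0, 0.0))         # the rim point has moved
```

This is the constructive case; as the text notes, the general fixed point theorems are usually proved non-constructively, guaranteeing that a fixed point exists without telling us where it is.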
This leads us to three conclusions. First, motion and stillness are not opposed, as the ancients supposed. The fixed points are just as much part of the motion as the moving points. The fixed points of the world, like ourselves, trees, atoms and stars, which we observe as components of a complex structure, are part of the divine dynamics.
Second, since the fixed points which we observe are divine, they are revelations of god. Instead of having to rely on a few ancient texts and institutions that claim to know the true meaning of these texts, everything we observe in the world and every one of our experiences is experience of god. This revelation serves as an enormous and invincible empirical foundation for scientific theology.
Third, we may understand a dynamical system in action to be simple insofar as it is seamless motion. Our own bodies comprise trillions upon trillions of tiny parts which we think nothing of unless they are damaged or in pain. Although the world appears to have many fixed points like trees, mountains and stars, all these structures are ultimately part of a dynamic whole, some moving faster than others. Although I might think of myself as a person or thing, I am better described as an event, lasting about a hundred years from my birth to my death. Brouwer fixed point theorem - Wikipedia
This theory of fixed points solves another old problem. It is commonly believed that the Christian god is omniscient, which in practical terms means that they have (or are) a database containing every bit of information about the universe from the moment it was created until it ends, if ever. A database, like any script, is a set of marks or fixed points. We noticed in section 5.4 that error free communication requires the use of discrete symbols. The fixed points serve as the discrete points which represent the mind of god. The traditional god is immutable and eternal, having but one form. The fixed points in the divine universe, on the other hand, are eternal as long as they last, but can be annihilated and new ones created. There is much more information in the life of a dynamic god than an immutable one.
Galileo's proscription by the Church marked a turning point in the relationship of scientists to politicians. Galileo lost his battle with the hierarchy of the Roman Catholic Church, but science has come a long way since then and its benefits are indisputable. It is, in a divine universe, an element of theology, and the ancients were right when they felt that a good relationship with the divine environment is the key to prosperity. While many politicians find scientific reality distasteful and would like to muzzle it, many voters can see its enormous benefits and vote to fund the sciences.
Science, like justice, is a continual fight against vested political (including financial) interests. This is why dictators have a penchant for imprisoning and murdering scientists and judges first, as an example to the others. If they do not have the stomach for murder, secrecy and threats are pretty good substitutes. There is nothing worse for a promising career than being forced to become a whistleblower against the industry where one is seeking employment. Karim Shaheen: March for Turkey's jailed judges highlights purge on dissidents
The Catholic Church and its theologians have managed to defend the core of their theology from the inroads of science, so we are still living by theological ideas that are two thousand years old. One reason for this is that the Church has convinced many people that science is not even relevant to theology. They champion the idea of two truths, one arising from observation of the world, the other from special revelation by god. This is the message of Pope John Paul II's address to the Pontifical Academy of Sciences:
How do the conclusions reached by the various scientific disciplines coincide with those contained in the message of Revelation? And if, at first sight, there are apparent contradictions, in what direction do we look for their solution? We know, in fact, that truth cannot contradict truth. . . .
Consequently, theories of evolution which, in accordance with the philosophies inspiring them, consider the mind as emerging from the forces of living matter, or as a mere epiphenomenon of this matter, are incompatible with the truth about man. Nor are they able to ground the dignity of the person.
That is, the truth established by the ancient magisterium of the Church takes precedence over anything science might come up with. Of course, given the hypothesis of this book, there is no issue. The Universe is divine and humanity obviously participates in divinity and has evolved in the image of god. Science is part of true theology. John Paul II: Truth cannot contradict truth, Magisterium - Wikipedia
The network model discussed in the previous chapter is an abstract formal structure. It is in no way constrained by the fact that in reality all information is encoded physically. So we imagine the countably infinite sets of natural numbers and Turing machines, and use them to generate the uncountably infinite transfinite hierarchy of ever more complex and numerous symbols.
The term infinity does not imply the existence of actually infinite quantities. It serves as a catchall upper bound which tells us that such processes as adding one to an existing number can continue forever without leading to contradiction.
The next step is to apply the network model to the world of our experience which seems, at first sight, very much smaller than a transfinite network. This is not the case. Mathematics creates formal frames of reference which we use to talk about the world, rather like Descartes' coordinate geometry. Nature does not use frames of reference. Like god, it just is. Every process is local, controlled by contact with its environment, not by reference to some abstract plan which some, like Isaac Newton, believe exists in the mind of god. A big problem for Einstein in formulating the general theory of relativity was to understand how to make a picture of the world that was independent of any system of coordinates.
An important feature of the network model is that it is in effect its own coordinate system. Every point in it has its own address built into it and all its interactions with other points are conducted in terms of contacts at their intrinsic addresses, rather than their positions on some map. So as we use our devices to communicate with other people around the world, our principal interest is in our interaction with these people, with no particular concern for their position on the globe.

6.3: The mathematical community
Because the network model is symmetrical with respect to complexity we can use it to move up and down from the traditional featureless god to the enormously complex world of daily experience. From an abstract point of view a community is a communication network. The human world is enmeshed in a web of networks beginning with our relationships to ourselves and expanding through individual personal relationships to the huge array of long distance networks established by travel, postal services, telecommunications and the endless audiovisual channels offered by the internet.
We are setting out to match a mathematical model of god to the world of our experience to see if they correspond. My draft of such a model is an honours thesis written in 2019. Jeffrey Nicholls: Prolegomenon to Scientific Theology
The mathematical relationships from which such models are constructed are a product of the mathematical community. We can read the recorded history of this community stretching back at least 5000 years and see its traces in older artefacts. George Gheverghese Joseph: The Crest of the Peacock: Non-European Roots of Mathematics
Mathematics is a cumulative endeavour, each new discovery or invention building on what is already known. Its formal structures, if correctly proven, do not decay and need no maintenance, apart from copying texts that deteriorate and adding new inventions which expand and reinforce the old.
The mathematical community is a layer in the human network whose physical representation comprises all the people involved. In this network, individual mathematicians are the sources and the formal and informal communications between them are the links which bind them into their community.
The formal representations of mathematical proofs and other statements exist outside time. Encoded in an eternal medium, or copied before each physical representation fails, they are effectively eternal. The Platonic model of god that we have inherited from antiquity also exists outside time, and is considered to be eternal. Paul Helm: Eternity (Stanford Encyclopedia of Philosophy)
Throughout recorded history, mathematics has been an important tool for understanding our world, beginning with arithmetic, accounting for discrete objects like coins and sheep, and extending to the measurement of continuous quantities like land and fabric which inspired geometry. The relationship between physics and mathematics was sealed in Galileo's time when he claimed that mathematics was the language of the universe. Newton took a great step forward when he invented calculus to describe the solar system. Gauss and Riemann extended calculus to define differentiable manifolds, which became the mathematical foundation for Einstein's general theory of relativity and the continuous groups of modern fundamental physics. This theory remains our picture of the large scale structure of the universe. Lie Group - Wikipedia
Newtonian physics ruled the world until the middle of the nineteenth century when the relationship between electricity and magnetism opened up a whole new field of study which lay outside Newtonian dynamics. Maxwell showed, however, that it is well within the range of calculus. Maxwell's differential equations opened up a new world of physics by explaining that light is a form of electromagnetic radiation. Around the same time spectroscopists were discovering the close relationships between light and matter and laying the foundations for quantum theory, which has placed new demands on mathematics and provided a rich field of new inspiration.
Quantum theory is now more than a century old and continues to raise more mathematical questions than it answers. The Clay Mathematics Institute is currently offering a million dollar prize to anybody who can make clear mathematical sense of quantum field theory. While the theory has delivered spectacular progress in our understanding of the foundations of the world, it still relies on some magical thinking to get results because we cannot really see what is happening down there. A century of ever more powerful particle accelerators and astronomical instruments has delivered mountains of data, but theory is struggling to catch up. In particular, there is a big mathematical gap between quantum mechanics and general relativity. Carlson, Jaffe & Wiles: The Millennium Prize Problems, problem 7.
Here I wish to draw an analogy between the mathematical community and the current state of quantum theory, based on the idea that the network model has a foot in both camps. The difficulty we face is outlined in 6.1 above. I am seeking insight by looking at the mathematical community through the lens of quantum field theory. The people in the community play the roles of structural particles, the fermions. The space-time web of communications between the players runs on bosons, the messengers. The field of fermions and bosons binds the community into a functioning whole. By being part of our own communities, we may get some feeling for how the system works.
The mathematical community is a dynamic entity, existing within time, whose communications are fixed points like theorems and discussions of theorems. We imagine that all human communities behave in an analogous manner. While mathematical theorems, if correctly proven, are considered to remain true forever, the communications we exchange in everyday life may be said to be fixed points for as long as they last, momentarily eternal. Our dynamic universe seems to be much bigger than the ancient immutable god since it comprises a long evolving sequence of eternal spacelike slices, each as big as the universe so long as it lasts.
The Platonic view of mathematics is that its theorems have some sort of independent existence outside humanity so that a mathematician producing a new proof is not so much creating something new as discovering something previously unknown. The situation is analogous to an explorer coming across a landscape that has never been seen before by human eyes, even though it may have existed in place for thousands or millions of years.
In the print oriented human world one must publish a discovery in some permanent and widely respected form to gain credit. There is work for historians of science to decide priority if an idea is published by many people at about the same time. No one who had the idea and did nothing about it enters into the discussion. A vast treasury of mathematics may exist in the Platonic heaven, but we know nothing of it until it is published. Events may exist in the mind of god, but we cannot know about them until they happen. David J. Gross: Nobel lecture: The Discovery of Asymptotic Freedom and the Emergence of QCD
There is considerable difficulty, as we discussed above, reconciling the eternity of god with the life of god, since life to us means motion. Here in the divine Universe, we understand all observables to be fixed points in the divine dynamics. We cannot see the dynamics, only the fixed points. In the spirit of science, therefore, we can say nothing about the invisible dynamics except what we can infer from what we observe.
All the communications within the community appear as physical representations of information, in speech, writing, images and so on. But there is also much more information represented in the minds of mathematicians, which we know to be physically represented by their bodies, particularly in their central nervous systems, and these dynamical states are open to being represented in suitably large Hilbert spaces.
Quantum field theory is built around the idea of gauge theory which assumes that in the world of fields there is a lot going on that we cannot see. All that we have to go on are the things that we can see, so that we assume that many details of what is happening in the invisible world must cancel out or be irrelevant so as to leave us with what we do see. This idea is called gauge symmetry. A mathematically perfect wheel is symmetrical: we cannot tell if it is spinning or not until we break its symmetry by putting a mark on it. In the mathematical community we may say that the observable output is the published theorems. All the flow of education, exploration and discussion that leads to the theorems is from this point of view invisible, the symmetry which is eventually marked by the emergence of a theorem. We see a similar phenomenon in our parliaments. Their practical output is legislation. Behind the legislation is an invisible mountain of talk and inquiry that feeds the construction of legislation.
It is easy to see where all the talk comes from in our communities. Where does the analogous invisible background come from in physics? It arises from our need for frames of reference to measure and construct things. A builder is presented with a set of plans, drawn by an architect, and a piece of land. The first step is to use the dimensions on the plan to drive in some pegs to outline the foundations. This is pretty easy. The land does not move and the measurements can be laid out using a tape and a bit of geometry. An astronomer trying to plot the motions of the solar system has to take into account all the motions of the planets and moons in three dimensional space. Not so easy, but after centuries of work astronomers have devised a number of reference systems. The equatorial system, for instance, is fixed by the direction of the Earth's poles and the intersection of a projection of the equator with the Earth's orbit at the March equinox. Different systems can be interchanged by appropriate geometric calculations. These coordinates, measured to high precision, are used for astronomical and cosmological work. Celestial coordinate system - Wikipedia
The establishment of coordinate systems for quantum mechanics is more difficult. It goes beyond geometry into establishing relationships between the infinite number of states occupied by an enormous variety of microscopic particles. These particles range in size from large molecules like nucleic acids and proteins to fundamental particles like electrons, photons, quarks and gluons. These states are represented by state vectors, |φ>, in Hilbert space, a linear complex vector space with any number of dimensions. John von Neumann: The Mathematical Foundations of Quantum Mechanics
The most interesting feature of quantum mechanics is superposition. Since quantum theory, like the theory of computer networks, is indifferent to the complexity of the system being studied, superposition can be illustrated using a two dimensional Hilbert space with basis vectors |0> and |1>. We can form a state vector by adding them together to form a superposition called a |qubit>: |qubit> = a|0> + b|1>, where a and b are complex numbers such that |a|² + |b|² = 1. This superposition is not directly observable. When |qubit> is observed, we see the value corresponding to |0> with probability |a|², which is a real number, and the value corresponding to |1> with probability |b|². Why this works is known as the quantum measurement problem. It appears that a lot of the information encoded in |qubit> is lost in the measurement process.
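As a concrete illustration, the loss of information in measurement can be simulated with ordinary pseudo-random sampling. This is a minimal sketch, not a physics library; the function name measure and the sample size are arbitrary choices:

```python
import random

# A qubit as a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
def measure(a, b):
    """Collapse the superposition a|0> + b|1> using the Born rule."""
    p0 = abs(a) ** 2          # probability of observing |0>
    return 0 if random.random() < p0 else 1

# An equal superposition: a = b = 1/sqrt(2), each outcome has probability 1/2.
a = b = 2 ** -0.5
counts = [0, 0]
for _ in range(100_000):
    counts[measure(a, b)] += 1

# The amplitudes themselves (including their relative phase) are lost;
# only the two frequencies, each near 0.5, survive the measurement.
print(counts[0] / 100_000, counts[1] / 100_000)
```

Repeated observation recovers the probabilities |a|² and |b|², but nothing more: infinitely many distinct superpositions would produce the same statistics.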
It is difficult to decide if this lost information really exists in the first place. Given that we can (theoretically) make an infinity of different superpositions in a two dimensional Hilbert space, we must wonder where all this information is stored. Do the fields proposed by field theory really exist, or is all the information stored in the particles?
The "mathematical field" in the mathematical community exists in the minds of the mathematicians and their communications. How do we model this field as a local gauge symmetry? One point to note is that given all the human languages that may be involved, each core mathematical idea may have a large space of expressions which can be translated or transformed into one another. Here I will assume, by analogy with the mathematical community, that the information attributed to fields is stored in or represented by particles. I presume that an electron, like a mathematician, has a personality that guides its interaction with other particles.
6.4: Divinity to trinity
We now turn to our deepest physical foundations, the lower physical layers in the universal network. We have previously noticed that we can imagine two structureless sources of the Universe: the traditional Christian God and the initial singularity predicted by general relativity. Gravitational singularity - Wikipedia
The line of thought developed in this essay started in 1965 when I read Bernard Lonergan's claim that the existence of 'empirical residue' shows that the Universe is not god. This did not seem right to me. I felt that every datum, that is every event, has a history stretching back to the beginning. God, after all, is not an event; it just is. Nevertheless we understand fixed point theory to predict that a god will embody fixed points that are related to one another and given meaning by the underlying dynamics. Lonergan: Insight: A Study of Human Understanding
In chapter 1 I mentioned that way back in my monastic days I guessed that the theory of the Trinity might provide a means to link the absolute simplicity of the traditional god to the unlimited complexity of the observable world. Fifty years later this still looks good. Now I see the Christian doctrine of the Trinity in the light of fixed point theory. The first steps in the differentiation of god were already taken for us long ago in the theory of the Trinity developed by Augustine and Aquinas and studied in modern times by Bernard Lonergan. They now merely require translation into the modern idiom and extension from a duality of two personae (Father and Son) to the third person of the Trinity (the Holy Spirit) and beyond to a transfinite number of persons (sources). Trinity - Wikipedia, Lonergan (2007): The Triune God: Systematics
The ancient theologians needed to find a way to reconcile the unity of God with the multiplicity of personalities suggested in the Bible. Aquinas, following Augustine, dealt with this problem using a psychological model drawn from ancient Greek psychology. This idea first appears in John's suggestion that the second person is the Father's Image of Himself, not an abstract image as might occur in a human mind, but a concrete image, as real and divine as the Father. The third person of the Trinity, the Holy Spirit, is understood as the realization of the love between Father and Son. The constitution of God was mapped onto ancient ideas of the human spiritual constitution, consistent with the idea that we are conscious beings, created in the image of god, capable of reflecting on ourselves and loving our reflection. Aquinas, Summa, I, 27, 1: Is there procession in God?, Aquinas, Summa, I, 34, 2: Is "Word" the Son's proper name?
Many churches consider the trinity to be a revealed truth, not open to either explanation or contradiction. What is important, from our point of view, is that the traditional god can be many. The fact that the observed Universe is extremely complex does not, therefore, immediately rule out the claim that the Universe is divine.
The Trinity is an example of the basic unit from which networks are constructed: two sources and a link between them. In quantum mechanical terms, two fermions and a boson. Networks expand by copying themselves, adding more and more sources and linking them together. The Australian project to build a National Broadband Network is an example of this process. We see it also everywhere in life, organisms growing by cells duplicating themselves and the daughters differentiating and remaining in close communication with one another.
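The growth-by-duplication idea can be put in a few lines of code. This is a toy sketch only; the node and link representation is purely illustrative, with each node spawning one copy per generation and keeping a link to it (the two-sources-and-a-link unit):

```python
# Network growth by duplication: every node copies itself each generation
# and remains linked to its daughter.
nodes = [0]
links = []           # pairs (parent, daughter)

for generation in range(3):
    for parent in list(nodes):        # snapshot: don't iterate new daughters
        daughter = len(nodes)
        nodes.append(daughter)
        links.append((parent, daughter))

# Nodes double each generation: 1 -> 2 -> 4 -> 8,
# and the links form a connected tree of 7 edges.
print(len(nodes), len(links))  # 8 7
```

The doubling rule means the network grows exponentially while remaining fully connected, which is the pattern this section attributes to both cellular life and communication networks.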
We assume that the initial singularity shares the properties of the traditional god and is capable of reproduction and differentiation. If it was not, we would not be here. Although the psychological model of the Trinity provides a credible model as an aid to belief, there remains a problem. While all three persons are understood to be identically god they are also held to be really distinct. The contradiction involved in this picture is represented by the Scutum Trinitatis (Shield of the Trinity), a graphic representation of the Trinity. To avoid this contradiction, we introduce space. Logically, we define space as a situation where distinct objects, p and not-p, can exist simultaneously. Traditional theology claims that god is not a body and so does not occupy space. If we are to have many identical things, however, it seems that we need space to distinguish them. In 6.7 we add more detail to this idea. Fundamental particles are divided into two classes, bosons that are attracted to existence in the same place, and fermions that resist existing in the same place. Shield of the Trinity - Wikipedia
6.5: Physical evolution 1: action to energy
We imagine that the fixed points emerge in the divine dynamics in an orderly sequence, beginning with very simple systems which gradually became more complex by a process of differentiation and evolution. This fits our observations of the history of the Universe. We begin, like Aristotle and Aquinas, with an absolutely simple god of pure action. In Christianity, the first bifurcation of the divinity is into Father and Son. Here we guess that the analogous bifurcation was the origin of time and energy.
This identification is suggested by classical dimensional analysis which sees physical parameters in terms of the dimensions mass, length and time, M, L and T. The first compound we construct is velocity, distance divided by time, which we write LT⁻¹. In classical physics, kinetic energy is given by the formula KE = ½mv². From this we conclude that the dimension of energy is ML²T⁻². Dimensional analysis - Wikipedia
Action, in modern physics, is the time integral of energy which, at its simplest, means energy multiplied by time. The dimension of action is therefore ML²T⁻² × T = ML²T⁻¹. We might see time and energy as a duality created by the emergence of these two observable fixed points from pure action. One of the fundamental equations of quantum mechanics is E = hf, where the frequency f is the inverse of a time. A third definition of energy is work, the product of force and distance. Force is mass × acceleration, so once again the dimension of energy becomes MLT⁻² × L = ML²T⁻².
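This dimensional bookkeeping is mechanical enough to check in code. In the sketch below a dimension is an (M, L, T) exponent triple; multiplication of quantities adds exponents and division subtracts them. The helper names mul and div are ad hoc:

```python
# Dimensional analysis with (M, L, T) exponent triples.
def mul(a, b):
    return tuple(x + y for x, y in zip(a, b))

def div(a, b):
    return tuple(x - y for x, y in zip(a, b))

MASS, LENGTH, TIME = (1, 0, 0), (0, 1, 0), (0, 0, 1)

velocity = div(LENGTH, TIME)                     # L T^-1
energy   = mul(MASS, mul(velocity, velocity))    # M L^2 T^-2, from KE = mv^2/2
action   = mul(energy, TIME)                     # M L^2 T^-1, energy x time
# Work = force x distance, force = mass x acceleration:
work     = mul(mul(MASS, div(LENGTH, mul(TIME, TIME))), LENGTH)

print(energy)           # (1, 2, -2)
print(action)           # (1, 2, -1)
print(work == energy)   # True: work has the same dimension as energy
```

The two routes to energy, via kinetic energy and via work, land on the same exponent triple, which is the consistency the text relies on.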
In quantum theory time is measured as a count of actions, like the ticks of a clock. We see action as ubiquitous, primordial and undifferentiated, an atomic unit which appears in the observed world as the quantum of action. In the model proposed here, every action is a representation of the initial singularity, so we take the initial singularity to be the primordial act, consistent with the idea proposed by Aristotle and Aquinas that god is pure act, actus purus.
Logically, an action is simply something that changes some p into some not-p. When I move my hand it goes from here to there, which is not-here. An action has no intrinsic physical size, so there is no problem identifying it as both divine and the smallest entity in the universe, measured by the quantum of action. This is consistent with Aquinas's idea that god is ubiquitous in the universe. Aristotle's definition of time, the number of motion according to before and after, seems consistent with this idea. So we take the first step in the emergence of the current universe from the initial singularity to be the bifurcation of pure action into energy and time. Another way of saying this is that the symmetry we call action has broken to energy and time. In terms of our computer model, we now have the foundation of a real computer, a clock that synchronizes all the logical activity in the machine. Aquinas, Summa, I, 8, 1: Is God in all things?, Aristotle: time, Planck constant - Wikipedia
Turing's mathematical computer, like all of mathematics, is formal and outside time. Turing's idea has evolved through many physical implementations to become the computers we now use. Apart from synchronization, the other role of the clock in a computer is to hide the physical dynamics behind the logical process. Given a machine in a physically stationary state, the clock emits a signal which sets everything in motion, carrying the computation one step forward. After enough time has elapsed for the physical processes to reach equilibrium, the clock emits a second signal which freezes a snapshot of the new state. Turing noted that a human computer goes through a similar routine. Given a written copy of a certain stage in a computation, the computer then works to carry the computation one step forward and writes down the result. They may then stop work and hand their result to the next shift, which has all the information necessary to carry the computation forward another step.
How do particles evolve and become more complex? Evolution requires memory to carry information from generation to generation. In living creatures this memory is provided by the nucleic acids DNA and RNA. This system dates from close to the origin of life. Genes for some of the fundamental metabolic processes, like the citric acid cycle that plays a key role in human energy metabolism, are found in the Archaea which evolved about three billion years ago. These genes have been reproduced through trillions of generations to become part of our bodies. Citric acid cycle - Wikipedia, Archaea - Wikipedia
How did this structure come about? There is a lot of speculation, but no certainty because it happened about three billion years ago and we have almost no evidence to go on except a few fossils and laboratory and computer simulations. Given the existence of gene based life, however, the process of evolution from single-celled organisms to the present is relatively transparent since early bacteria are still with us and open to study. Earliest known life forms - Wikipedia
Traditional western theology imagines that the universe was intelligently designed and powerfully created by an omniscient and omnipotent god. Here I identify the initial singularity as the starting point of the universe and identify it with the god of Aquinas because both share the attributes of existence, absolute simplicity and creation of the world. The absolute simplicity of the creator presents a cybernetic problem, insofar as the principle of requisite variety prohibits a simple system from controlling a complex one (5.5). On the other hand, it seems reasonable to attribute omnipotence to both the traditional god and the initial singularity since the only constraint on each of them is internal consistency. Neither is subject to external constraint.
The scientific approach to physics and cosmology requires that we take the world as we find it and attempt to understand the processes that make it behave as it does. This project made a great leap forward in the twentieth century with the discovery of quantum mechanics and relativity, but has left us with many puzzles, some outlined in sections 6.1 - 6.3. This approach, common to all science, is an attempt to understand the past from evidence obtained in the present. Implicit in this approach is the scientific belief that our observations of the world and the mechanisms that lie behind these observations are consistent. If we do find inconsistencies, we may take this as evidence that we are on the wrong track and need to recheck our models and observations. Apparent inconsistency is the driving force of scientific progress.
Another approach that supports the idea of understanding the past through the empirical present is the anthropic cosmological principle. The idea here is that the Universe was deliberately constructed by a designing creator to allow for our evolution. This conclusion arises because some see evolutionary bottlenecks which require precise tuning of various physical parameters to arrive at conditions conducive to life. One of these concerns the creation of carbon itself. We understand that heavier elements are synthesized by fusion of lighter ones. It turns out that there is no way to make carbon except by the fusion of three helium nuclei. This seems at first sight a very improbable event, which is nevertheless made possible by a couple of coincidences which may have been designed in by a creator. Anthropic principle - Wikipedia, Barrow & Tipler: The Anthropic Cosmological Principle
The first of these is a resonance of beryllium-8 which increases the probability of fusion of two helium-4 nuclei. The second is the existence, predicted by Hoyle, of an excited state of carbon-12 which encourages the fusion of beryllium-8 and helium-4. The anthropic argument suggests that these resonances might have been designed in to the Universe to favour the evolution of carbon and ultimately the evolution of carbon based life forms. Triple-alpha process - Wikipedia
An alternative to working from the present, which includes our existence, to the past is to work from the past to the present. This approach is made possible by the fact, derived from general relativity, that the initial singularity has no structure and presumably zero entropy. The problem raised here is the same as we faced when dealing with the absolutely simple classical god, the principle of requisite variety. The initial singularity with entropy zero has no a priori power to constrain the universe. This is not only a difficulty, it is also an advantage. Since the initial singularity has no power to constrain the universe, we would expect the universe to span the fullness of possibility. This universe should therefore have the same power as the omnipotent classical god, which is also limited only by consistency. It may also be taken to mean that the initial singularity is a necessary being in that its essence is identical to its existence, its essence is to exist. Aquinas, Summa: I, 25, 3: Is God omnipotent?, Aquinas, Summa, I, 3, 4: Are essence and existence the same in God?
I am inclined to believe that evolution began at the beginning. We are proposing to describe the structure of the universe using the computer network described in chapter 5. The simplest operation a computer can perform is to do nothing. As long as the initial singularity does nothing, we can think of it as eternal. The first meaningful operation is the logical not, written not or ¬. In the world of binary logic, ¬¬p = p.
If we think of a wave as the sequence up-down-up-down . . . and understand down = ¬ up, we may see a wave as a sequence of not operations, and interpret a quantum of action executing such a sequence as a particle emitting energy. At this point we imagine that the duration of each cycle is random, but that the process is nevertheless cyclic. We propose cyclic reproductive behaviour as a criterion for the permanence of structures and so for the group like behaviour of much of what we observe in nature. The emergence of energy is the first step in the evolution of the universe. At this point we have no space but two time division multiplexed states, which we have called up and down, more abstractly p and not-p, more concretely potential and kinetic energy.
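The up-down wave can be sketched directly as iterated logical negation. This is an illustration only; the variable names are arbitrary:

```python
# A "wave" as a sequence of not operations: p, ¬p, p, ¬p ...
p = True
wave = []
for _ in range(8):
    wave.append(p)
    p = not p

# not-not-p returns to p: every second state repeats,
# one full cycle of the two multiplexed states.
print(wave)  # [True, False, True, False, True, False, True, False]
```

Reading True and False as up and down (or potential and kinetic energy) gives the two time division multiplexed states described above, with each double negation completing one cycle.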
6.6: Physical evolution 2: energy to entropy
In the absence of an omniscient divinity to create the details of the world, we need another mechanism. Broadly, we understand the initial singularity to be a source of random action and propose that some of these sequences of action become stable structures by establishing recursive closure so that they are able to maintain their own existence. We see this already in the generation of energy by the logical operation not, since in the binary system, not-not-p brings us back to p. In the vastly more complex context of life, species maintain their existence by continuously reproducing new individuals so that death is overcome by birth in a cyclic process closely analogous to a wave. We can add detail to this idea using the network model.
The conservation of energy is now recognised as a fundamental symmetry of the physical world. An important feature of communication networks is their layered structure. Each layer provides services to the layer above it so that it is in the interests of the upper layers to maintain the lower layers, since the higher layers cannot exist without their support. Peers at any level in a network communicate by sending a signal down through the network layers to the hardware layer which carries a physical representation of the information between them. The hardware signal is then translated up through the recipient network into a form intelligible to the recipient peer. All communication must go through the hardware. We understand the initial singularity to be the ultimate hardware of the universe. All users of the universal network therefore share the energy of the universe, which we may understand to account for its conservation. Since the creation of energy is a random event shared by all agents in the universe the fact of conservation of energy does not imply any absolute value of the energy of the universe, opening the way for the notion that the net energy of the universe may be zero. Conservation of energy - Wikipedia, Zero-energy universe - Wikipedia
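The pass-down, pass-across, pass-up pattern can be sketched with a toy three layer stack. This is not a model of any real protocol; the layer names are invented purely for illustration:

```python
# Each layer wraps the message on the way down; the recipient unwraps
# the layers in reverse order on the way up. All traffic crosses the
# bottom ("hardware") layer.
encode_layers = [
    lambda m: f"app({m})",
    lambda m: f"transport({m})",
    lambda m: f"hardware({m})",
]

def send(message):
    for layer in encode_layers:          # down the sender's stack
        message = layer(message)
    return message                       # the physical signal

def receive(signal):
    for name in ["hardware", "transport", "app"]:   # up the recipient's stack
        assert signal.startswith(name + "(") and signal.endswith(")")
        signal = signal[len(name) + 1:-1]
    return signal

print(receive(send("theorem")))  # theorem
```

The peers at the top see only each other's messages; the wrapping and unwrapping, like the shared hardware itself, is invisible to them.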
Symmetries in general are features of lower layers of the universal network which are broken by the higher layers. They are the unchanging features of the moving world first imagined by Parmenides. Logically, we may think of a symmetry as an algorithm which remains constant as it is instantiated for different tasks by its users.
The conservation of energy is now known as the first law of thermodynamics. The second law of thermodynamics tells us that the universe has a general tendency to increase its entropy, that is, to increase its count of distinct states. More simply, it is creative, and our principal task here is to understand how the creator manages to create itself.
Energy is easy to understand since it is simply the time rate of action expressed by the fundamental equation of mechanics E = hf. Entropy is rather more subtle but in a sense simpler, since it is merely a count, without any physical dimension. It is a dimensionless measure of creation. The physical concept of entropy entered the world with the invention of steam engines. The French physicist Sadi Carnot first hinted at entropy in 1824 in his Reflections on the Motive Power of Fire. He understood that heat is a form of energy, and that the extraction of mechanical energy requires the passage of heat from a hot source to a cold source. He invented a reversible cycle, the Carnot cycle, to achieve this and was able to derive a formula which predicted the maximum proportion of heat energy that could be extracted as mechanical energy by running the cycle between two temperatures T₁ (high) and T₂ (low). This proportion, the efficiency η of the ideal heat engine, is given by (T₁ − T₂) / T₁. The Carnot cycle is reversible, so it can also model refrigeration, using mechanical energy to pump heat from a cold source to a hot one. Reflections on the Motive Power of Fire - Wikipedia
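Carnot's formula is simple enough to state as a one-line function. The temperatures in the example are arbitrary round numbers, assumed to be absolute (kelvin):

```python
# Carnot efficiency: the maximum fraction of heat convertible to work
# depends only on the two absolute temperatures.
def carnot_efficiency(t_hot, t_cold):
    return (t_hot - t_cold) / t_hot

# An ideal engine running between 500 K and 300 K:
eta = carnot_efficiency(500.0, 300.0)
print(eta)  # 0.4: at best 40% of the heat can be extracted as work
```

Note that the efficiency approaches 1 only as the cold reservoir approaches absolute zero, and vanishes when the two temperatures are equal.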
Further developments of Carnot's idea by Emile Clapeyron and Rudolf Clausius crystallized the notion of entropy. Entropy is the quantity that is conserved in the Carnot cycle: any reversible process must conserve entropy. The second law of thermodynamics states that entropy never decreases. Both the first and second laws apply to closed systems, which we take the Universe to be.
In classical thermodynamics, entropy is the inverse of temperature. The higher the temperature of a given quantity of heat, the lower its entropy. The assumption in the big bang model that the Universe began at a very high temperature suggests that its entropy was very low. The consequent fall in temperature at constant energy is equivalent to an increase in entropy, consistent with the second law. One of the difficulties with the usual understanding of the initial singularity and the big bang is that the universe started off as a point of infinite energy and temperature, which makes little sense physically.
The entropy concept gained a second life with the development of the mathematical theory of communication. As we have seen (5.4), Shannon defined the entropy of a communication source as a function of the number and frequency of the letters of the source alphabet. Here entropy is interpreted as a measure of uncertainty. Information, which reduces uncertainty, is then measured as the reduction in uncertainty caused by the receipt of the information. The unit of uncertainty, the bit, is measured by the choice between yes and no. A game like twenty questions, which involves 20 yes/no answers, thus provides 20 bits of information, sufficient to decide between 2²⁰ = 1 048 576 possibilities.
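The twenty questions arithmetic can be checked directly. A uniform source of N equally likely alternatives carries log₂ N bits of uncertainty, so 20 binary answers resolve 2²⁰ possibilities:

```python
import math

# Entropy of a uniform source of n equally likely symbols, in bits.
def uniform_entropy_bits(n):
    return math.log2(n)

questions = 20
possibilities = 2 ** questions
print(possibilities)                        # 1048576
print(uniform_entropy_bits(possibilities))  # 20.0 bits resolved by 20 answers
```

Each yes/no answer halves the remaining possibilities, which is exactly what it means to supply one bit of information.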
The mathematical measure of entropy in communication theory is consistent with the thermodynamic measure. Carnot and his contemporaries did not know that the water and steam in their engines comprised vast numbers of molecules. Later, Ludwig Boltzmann began to study thermodynamics from a molecular perspective and realized that the entropy of a substance is a function of the number of its internal states. He found the relation S = k log W, where S is entropy, k is the Boltzmann constant which relates the count of microstates to metric units, and W is the number of internal states or 'complexions' of the system measured. This shows that reversibility is a consequence of the cybernetic principle of requisite variety: number of states in = number of states out and vice versa. Boltzmann's entropy formula - Wikipedia
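Boltzmann's relation is easy to evaluate numerically. The sketch below uses the exact SI value of k; the microstate counts are arbitrary, chosen to show that doubling W adds k ln 2 to S, the thermodynamic cost of one extra bit:

```python
import math

# Boltzmann's relation S = k log W connects a bare count of microstates W
# to entropy in metric units (joules per kelvin).
K_BOLTZMANN = 1.380649e-23  # J/K, the exact SI value

def boltzmann_entropy(w):
    return K_BOLTZMANN * math.log(w)

# Doubling the number of microstates adds k * ln 2 to the entropy.
delta = boltzmann_entropy(2 * 10**6) - boltzmann_entropy(10**6)
print(delta)  # ~9.57e-24 J/K, i.e. k * ln 2
```

This is the bridge between the two faces of entropy: the dimensionless count of communication theory and the joules-per-kelvin quantity of thermodynamics differ only by the constant k and a choice of logarithm base.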
Entropy has had a bad press over the years since increase in entropy became associated with an increase in chaos. Ideal mechanical energy has zero entropy, so designers of heat engines saw entropy as opposed to thermal efficiency. Entropy as a measure of information, on the other hand, lies at the foundation of cybernetics and information theory.
We have already noticed that the cybernetic principle of requisite variety tells us that a complex system can only be controlled by a system of equal or greater entropy or complexity (5.5). Gregory Chaitin has shown that this principle is a consequence of Gödel’s incompleteness theorem:
Gödel's theorem may be demonstrated using arguments having an information-theoretic flavour. In such an approach it is possible to argue that if a theorem contains more information than a given set of axioms, then it is impossible for the theorem to be derived from the axioms. In contrast with the traditional proof based on the paradox of the liar, this new viewpoint suggests that the incompleteness phenomenon discovered by Gödel is natural and widespread rather than pathological and unusual. Gregory J. Chaitin: Gödel's Theorem and Information
Traditional theology maintains that the combination of the divine omniscience and omnipotence gives God the ability to completely know and control every detail of the future. Requisite variety invalidates this claim. Insofar as God is understood to be absolutely simple, its variety is zero, and so its powers of knowledge and control are absent. This raises the problem at the heart of this essay: How could the universe be the work of a divine intelligent designer when the divinity is so simple that it cannot control anything? We have already noted the first step to the answer: random uncontrolled processes can generate new states, and so may be seen as a source of entropy. At this point, however, we may imagine new states to be annihilated as fast as they are created so that although energy may be conserved, entropy remains ephemeral.
6.7: Quantum theory and the creation of space-time
We are trying to put together a universe that looks like a gigantic computer network, starting from the basic functions of Boolean algebra, not and and. The foundation of any computer system is its operating system. The basic roles of the operating system are to manage communication and memory. Here we set out to conceive space-time as the operating system of the universe, the representative domain of the transfinite computer network.
Much thought in physics has gone into two questions: how do we quantize gravitation; and what really happens when we make a quantum measurement. Both questions relate to the interface between the quantum and classical worlds and both may be the wrong questions. In Einstein's Mistakes, Steven Weinberg writes:
The Copenhagen interpretation describes what happens when an observer makes a measurement but the observer and the act of measurement are treated classically. This is surely wrong: Physicists and the apparatus must be governed by the same quantum mechanical rules that govern everything else in the universe. But these rules are expressed in terms of a wave function (or more precisely a state vector) that evolves in a perfectly deterministic way. So where do the probabilistic rules of the Copenhagen interpretation come from? . . . It is enough to say that neither Bohr nor Einstein had focussed on the real problem of quantum mechanics. The Copenhagen rules clearly work, so they must be accepted. But this leaves the task of explaining them by applying the deterministic equation for the evolution of the wave function, the Schrödinger equation, to observers and their apparatus. Steven Weinberg: Einstein's Mistakes
Perhaps the distinction between the quantum and classical worlds is a furphy?
Going a bit further, perhaps gravitation does not need quantization because it is already what we might call naked quantum mechanics, a hybrid of quantum and classical physics. Most of the time we are looking at the outsides of particles, electrons or people, and trying to work out what is going on inside. In the case of gravitation, however, we are inside the particle, the universe, in the midst of the process that shapes the universe.
Energy as we have explained it is a wave, the time division multiplexing of action. If we are to have a universe of zero energy from the beginning, we may interpret this wave as a harmonic oscillator like a pendulum, cycling between kinetic and potential energy. If we compute the action of a formally perfect pendulum over a long period of time using the Lagrangian we find:
S = ∫ (KE - PE) dt = 0
since its potential energy is equal to its kinetic energy. The potential and kinetic energy of a pendulum are equal because gravitation is a conservative field: the kinetic energy of a freely falling particle increases at exactly the same rate as its potential energy decreases, so the process is (energetically) reversible and the entropy of the system remains (theoretically) constant over time.
An isolated quantum process, going its own way at constant energy without any outside influence, is also reversible. Mathematically this is guaranteed by the wave equation, which preserves unitarity. We may interpret this in terms of communication as the work of a reversible codec (6.1) transforming the state vector. Mathematically this codec is represented in quantum mechanics as a unitary operator, so we are led to suspect the operation of unitary operators in the gravitational transformation of free fall.
A consequence of unitarity is that a normalized quantum system maintains its normalization through time, and if identically prepared systems are observed often enough it will be found that the sum of the probabilities of the outcomes is 1, as predicted by the Born rule. Mathematically, this is a consequence of normalization and shows that the possible outcomes of observation form a complete system of independent events, a property shared with a communication source described by the mathematical theory of communication.
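The connection between unitarity and the Born rule can be sketched in a few lines. In this toy example (the state and rotation angle are arbitrary choices) a simple 2 x 2 unitary operator evolves a normalized state through many steps, and the Born-rule probabilities of the two outcomes still sum to 1.

```python
import cmath
import math

# A sketch of the claim above: a unitary operator (here a simple
# 2 x 2 real rotation, which is unitary for any angle) preserves
# the norm of a state vector, so the Born-rule probabilities of
# the outcomes always sum to 1. State and angle are arbitrary.

def apply_unitary(theta, state):
    a, b = state
    return (math.cos(theta) * a - math.sin(theta) * b,
            math.sin(theta) * a + math.cos(theta) * b)

state = (cmath.exp(1j * 0.7) * 0.6, 0.8)   # normalized: 0.36 + 0.64 = 1
for step in range(1000):                   # evolve through many steps
    state = apply_unitary(0.01, state)

probabilities = [abs(amp) ** 2 for amp in state]
print(sum(probabilities))                  # stays 1 to rounding error
```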
Newton's theory of gravitation has one glaring weak spot, the assumption of action at a distance. It seemed to fly in the face of the common sense view that action requires contact. Einstein solved the problem by producing a field theory of gravitation, and all the other departments of physics have followed suit.
The layered network model identifies symmetries in the universe as the footprints of lower layers which have been applied by higher layers for their own purposes, just as the simple algorithms of arithmetic are instantiated by their users in every application. We might use this model to explain the role of the velocity of light in the emergence of Minkowski space-time as an application instantiated from the underlying quantum mechanics of pure energy. The reason for this is that the null geodesics made possible by the Minkowski metric enable systems to maintain contact (and therefore causality) over all but space-like intervals. Before space emerged, all processes were naturally in contact. This hypothesis gives us some insight into both the quantization of gravitation and the events we call quantum observation or measurement.
The heart of Newtonian physics is to be found in the differential equation F = ma, the classical coupling between force, mass and acceleration. The equivalent in basic quantum mechanics is the equation E = hf, the logical coupling between energy and processing frequency via the quantum of action. Quantum mechanics and gravitation are the fundamental symmetries of a layer in the universal network that sees only energy. They are applied by the layer above them to create the spacetime which serves as the operating system of the universe, managing memory and communication.
We understand a particle as a quantum of action embodying a logical process. While "not" is the simplest such process, we imagine that particles may embody the logical equivalent of any halting Turing machine. Larger particles (like myself) may embody layered networks of such machines. We understand all the operations in a local computer to be synchronized by one timing signal. The machines in a network, on the other hand, may be spatially separated and run on independent clocks, introducing a probabilistic element into their interaction. Time-division multiplexing - Wikipedia
I may think of myself as a complex web of network processes which embody every physiological detail of my life down to the level of fundamental particles. I am a particle whose mass is approximately 100 kg occupying about 100 litres of space. From my mass and lifetime I can calculate that the overall human-size quantum action which constitutes my life comprises some 10⁶⁰ Planck-size quanta executed over 100 years at the rate of about 10⁵⁰ quanta per second, blending seamlessly to make me act as I do. Since the universe may last forever, we cannot calculate its lifetime action, but a plausible mass might be 10⁵² times greater than mine, so that it is executing something like 10¹⁰⁰ Planck quanta per second. This we might interpret as the rate of interaction of its population of fundamental particles. Thinking along these lines, we may consider the universe itself as a particle containing a web of network processes. Observable Universe - Wikipedia
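This order-of-magnitude arithmetic can be reproduced by taking the action of a body over its lifetime as E x t / h with E = mc². The sketch below uses deliberately crude round numbers; with inputs this rough the exponents land within an order of magnitude or two of the round figures quoted above.

```python
import math

# A rough reproduction of the order-of-magnitude arithmetic above:
# total quanta of action over a lifetime = m c^2 t / h.
# All figures are deliberately crude round numbers.
h = 6.6e-34          # Planck's constant, J s
c = 3.0e8            # speed of light, m/s
year = 3.2e7         # seconds in a year, roughly

mass = 100.0                         # kg, a human-sized particle
lifetime = 100 * year                # a 100 year life
quanta = mass * c**2 * lifetime / h  # total quanta of action
rate = quanta / lifetime             # quanta per second

print(f"total quanta ~ 10^{round(math.log10(quanta))}")
print(f"rate ~ 10^{round(math.log10(rate))} per second")
```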
Quantum mechanics is a hypothetical mathematical description of the processes driving the four dimensional space-time in which we live our lives. These processes are not visible and we understand them to be in perpetual motion as time goes by. The description of time and energy given in 6.4 above is not continuous but sees both time and energy progressing in discrete steps, one quantum of action at a time. If we put f = 1 in the equation E = hf we see that the quantum of action is both a unit of energy and a unit of time, as we expect when the symmetry underlying energy and time is action. This is consistent with my feeling that continuous quantities have only a platonic existence and the minimum physical representation of information is the quantum of action.
An important mathematical feature of quantum theory is that it is indifferent to the complexity of the situation which it describes. The complexity of any quantum situation may be measured by the number of dimensions of the Hilbert space in which it is modelled. All the features of quantum mechanics are present in a two-state system in a two-dimensional Hilbert space. Quantum mechanics can carry us from the simplest two-state systems to the mathematical limit for the description of consistent systems, that is to the boundaries of logical certainty established by Gödel and Turing.
The mathematical structures described by the quantum theory are symbolically represented in the literature but the corresponding physical entities are invisible to us. In this the theory differs completely from classical mechanics which is about the motions of clearly visible objects like planets, pistons, gears and pool balls. This difference is the source of the "measurement problem" which has been a bugbear of quantum theory since the beginning. Measurement problem - Wikipedia
This invisibility is not surprising. We communicate with one another in the classical world by body language, which includes such physical things as the sounds of speech, facial expressions, gentle and loving touch and physical violence. We cannot observe the hidden processes that lie behind this language, although, from our experience of ourselves, we can form a pretty good idea of what people are thinking when they act in certain ways. We know that the psychological processes behind our behaviour are the result of physical processes in our nervous systems, and that these hidden processes are often a lot more complex than the physical result. We say that a wink is as good as a nod, but there are circumstances, like auctions and romances, where hours of thought may go into the decision to wink or nod, let alone say something.
The same thing happens with computing machinery. This computer chugs along at a billion or so logical operations per second. Mostly it is just marking time, waiting for me to hit a key. Then it sets to work, invisibly, to process the keystroke, which may simply mean writing a comma on the screen. Having done this it stops work and begins marking time again unless there is some housekeeping to do. In the computer theoretical world, finishing a task is called halting. The computer reveals the result of its work by printing a comma on the screen. Turing invented his machine to solve Hilbert's decision problem and showed that there are problems that require a computer to make an impossible decision, so that it can never halt.
Another practical reason for the invisibility of processes is that communication is itself a logical process. If a machine (person) is fully occupied in some process they may not have the resources to explain what they are doing. Because we are large and complex organisms, we can do two things at once but down in the simple foundations of the universe, it is necessary for a process to stop what it is doing to communicate its results.
To get results from quantum mechanics the invisible substratum of probability amplitudes must be 'measured' or 'observed'. Although it is often said that observation involves the interaction of a classical system with a quantum system, we know that all systems are quantum systems, so observation involves physically embodied quantum systems, ie particles, interacting with one another to yield observable events. As noted above, we place no limits on the size of a particle.
The quantum theory presents information through two channels, one mediated by the Born rule and the other by the eigenvalue equation. The Born Rule predicts the probability of events whose formal nature is described by the eigenvalue equation. The eigenvalue equation picks out fixed points in quantum process which become visible because they are fixed. Born rule - Wikipedia
In section 5.6 above we summarized Feynman's succinct description of the quantum process. We may interpret the computation of a quantum probability as the extraction of a classical fixed point out of the dynamics of an amplitude.
The probability of an event in an ideal experiment is given by the square of the absolute value of a complex number φ which is called the probability amplitude:

P = probability
φ = probability amplitude
P = |φ|²

When an event can occur in several alternative ways, the probability amplitude for the event is the sum of the probability amplitudes for each way considered separately. There is interference:

φ = φ₁ + φ₂
P = |φ₁ + φ₂|²

If an experiment is performed which is capable of determining whether one or another alternative is actually taken, the probability of the event is the sum of the probabilities for each alternative. The interference is lost:

P = P₁ + P₂
He concludes:
One might still like to ask: “How does it work? What is the machinery behind the law?” No one has found any machinery behind the law. No one can “explain” any more than we have just “explained.” No one will give you any deeper representation of the situation. We have no ideas about a more basic mechanism from which these results can be deduced. Feynman, Leighton & Sands FLP III:01: Chapter 1: Quantum behaviour
We represent an amplitude by a complex number which we write φ = x + iy. We imagine this as a wave or a wheel spinning in one direction. We calculate the probability as the absolute square of the complex number, which we obtain by multiplying the number by its complex conjugate, written φ* = x - iy, which we imagine as a wave or wheel going in the opposite direction. The multiplication yields a real number P = |φ|² = (x + iy) × (x - iy) = x² + y². As Feynman notes, we cannot explain why this works, but it does, and it may be another example of the phenomenon observed by Wigner that mathematics is unreasonably effective in the physical sciences. Eugene Wigner: The Unreasonable Effectiveness of Mathematics in the Natural Sciences
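Feynman's two rules can be put into numbers. In the sketch below (the two amplitudes are arbitrary illustrative choices) the probability obtained by adding amplitudes differs from the sum of the separate probabilities by exactly the interference term 2 Re(φ₁φ₂*).

```python
import cmath

# Feynman's two rules in numbers. The amplitudes are arbitrary
# illustrative choices, written in polar form (magnitude, phase).
phi1 = cmath.rect(0.6, 0.3)        # amplitude for the first path
phi2 = cmath.rect(0.5, 2.1)        # amplitude for the second path

# Alternatives indistinguishable: add amplitudes, then square.
p_interfering = abs(phi1 + phi2) ** 2

# Alternatives distinguished by the experiment: add probabilities.
p_separate = abs(phi1) ** 2 + abs(phi2) ** 2

# The difference is the interference term 2 Re(phi1 * conj(phi2)).
interference = 2 * (phi1 * phi2.conjugate()).real
print(abs(p_interfering - (p_separate + interference)) < 1e-12)  # True
```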
The second source of information in quantum theory is the eigenvalue equation which picks out fixed points in the quantum dynamics. Quantum information is encoded in the phase or angle of vectors in Hilbert space. The Born rule uses a metric in Hilbert space, the "inner product" to compute the amplitude, that is the "distance" (in phase) between two states which measures the probability of transition between them. The eigenvalue equation identifies the operations on state vectors which do not change their phase. These state vectors are the ones we see when two states interact. Zurek points out that this selection is necessary for the transfer of information between the particles represented by the state vectors. Wojciech Hubert Zurek: Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical
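The "fixed point" reading of the eigenvalue equation can be shown in the simplest possible case. In this sketch (the operator σₓ and its eigenvectors are standard textbook objects; the pure-Python representation is our own convenience) the operator returns its eigenvectors unchanged apart from a real eigenvalue, which is the sense in which they are fixed.

```python
import math

# A sketch of the "fixed point" reading of the eigenvalue equation:
# for the Hermitian operator sigma_x = [[0, 1], [1, 0]], the vectors
# (1, 1)/sqrt(2) and (1, -1)/sqrt(2) are returned unchanged apart
# from a real eigenvalue (+1 and -1): their "phase" is fixed.

def apply(op, v):
    return tuple(sum(op[i][j] * v[j] for j in range(2)) for i in range(2))

sigma_x = ((0.0, 1.0), (1.0, 0.0))
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))     # eigenvector, eigenvalue +1
minus = (1 / math.sqrt(2), -1 / math.sqrt(2))   # eigenvector, eigenvalue -1

print(apply(sigma_x, plus))    # the same vector back: eigenvalue +1
print(apply(sigma_x, minus))   # the same vector scaled by -1
```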
All the known fundamental particles fall into two classes, fermions and bosons. Fermions play the role of communication sources and their principal property is that any two of them strongly resist occupying the same state in spacetime, the Pauli exclusion principle. They are therefore the quantum mechanical foundation for spatial extension, implementing the logical definition of space in 6.4. Bosons play the role of messengers, carrying information between fermions. They do not obey the exclusion principle. The difference between fermions and bosons is established by the spin-statistics theorem. The proof of this theorem in quantum field theory depends on the commutation relations of fields mapped onto the Minkowski metric of space-time. It may be, however, if quantum theory represents a network layer beneath spacetime, that the bifurcation of commutation relations accounts for the Minkowski metric rather than vice versa. Streater & Wightman: PCT, Spin, Statistics and All That pp 146 sqq, Spin-statistics theorem - Wikipedia
6.8: Measurement, creation and insight
In the absence of the traditional creative divinity, we assume that the universe creates itself, that is, it makes itself more complex, or increases its entropy. The contact of two quantum systems brings a new observable entity into the world, and possibly destroys others in the process. This is the normal procedure in the accelerator experiments that provide us with much of the information we have about fundamental physics. The question is: do this and other quantum interactions increase entropy?
Feynman's lecture highlights one of the most interesting features of quantum mechanics: some feel that when we look at quantum systems we change their behaviour. An isolated quantum system is believed to evolve deterministically as described by the Schrödinger equation (6.2). This equation has a number of solutions corresponding to the number of dimensions of the Hilbert space to which the equation is applied. All these solutions are considered to exist simultaneously, added together to give a state of "superposition" analogous to the superposition of overtones in a musical note. When we observe the system, however, we see only one of these solutions, and if we quickly observe it again, we see the same state again. This state of affairs is called "the collapse of the wave function". This collapse seems quite likely to be an artefact of the mathematical theory. It is not clear that the solutions of the wave equation are all physically represented in the first place.
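The textbook account can be rendered as a toy simulation. In this sketch (the amplitudes are arbitrary illustrative choices, and the two-outcome "measurement" is a deliberately simple model of the standard story, not a claim about mechanism) the first measurement picks an outcome with Born-rule probability, and an immediate second measurement repeats the same result.

```python
import random

# A toy version of the textbook account above: measuring a two-state
# superposition picks one basis state with Born-rule probability, and
# an immediate second measurement repeats that result.

def measure(state):
    """Collapse a two-state superposition; return (outcome, new_state)."""
    p0 = abs(state[0]) ** 2
    outcome = 0 if random.random() < p0 else 1
    collapsed = (1.0, 0.0) if outcome == 0 else (0.0, 1.0)
    return outcome, collapsed

random.seed(1)
state = (0.6, 0.8j)                    # probabilities 0.36 and 0.64
first, state = measure(state)
second, state = measure(state)         # measure again straight away
print(first == second)                 # True: the same state is seen again
```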
Many discussions of quantum measurement trace its peculiar behaviour to the presence of a conscious physicist doing the measuring. Here we take the view that the universe is continually observing itself through every quantum interaction of physical particles. The increase in entropy arising from quantum interaction suggests a close relationship between communication and creativity. This is consistent with everyday experience. We learn by meeting one another.
Why does the world create itself? The formal answers may lie in a combination of fixed point theory, which suggests that a closed dynamic system has fixed points, and Cantor's theorem, which suggests that it would be inconsistent for the world not to complexify. From Cantor's construction of the transfinite numbers, we guess that the key to complexification is combination and permutation. The general idea seems to be an evolutionary process. A system of pure activity will be inclined to try everything (variation) and those variations which involve local inconsistencies will be weeded out (selection).
In the context of the transfinite network, we understand the act of insight to be scale invariant, working at every level from quantum mechanics to individual human understandings and beyond that to the understanding of communities, nations and the whole human noosphere (6.3). Insight, thus understood, is the source of fixed points in the divine dynamics. In the theology of the Trinity, we may understand the procession of the Word, a new fixed point in the divinity, as an act of insight.
The paradigmatic act of insight invoked by Bernard Lonergan in his eponymous book is Archimedes' realization (in his bath) that the upward force on a submerged body is equal to the weight of fluid displaced. We all experience the act of insight: the often sudden moment of 'seeing' or understanding something. We recognise and observe such isolated acts because they are isolated. Here we understand insight in terms of communication as the act of decoding a message. In normal conversation this action is effectively instantaneous, and we rarely have to stop and think before we can work out what the person speaking to us means. On the other hand, some things can take a long time to understand. One may stare at a chessboard for hours and still not see what to do next. People studied motion for millennia before Einstein arrived at the theory of relativity.
We may understand insight as the conscious recognition that we have found a consistent path through a body of data which amounts, in surveying terms, to a closure. A surveyor may be confident in their observations and calculations if after having worked their way around a circuit back to their starting point they find that their measured and calculated distance from the starting point is close to zero.
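The surveying analogy can be put into numbers. In this sketch the traverse legs (bearing and distance) are invented for illustration: summing the displacement vectors around a closed circuit should return almost exactly to the starting point, and the residual, the misclosure, measures the consistency of the work.

```python
import math

# The surveyor's closure described above, in numbers. The traverse
# legs (bearing in degrees, distance in metres) are invented for
# illustration; a consistent circuit should sum to ~zero displacement.
legs = [(0.0, 100.0), (90.0, 100.0), (180.0, 100.0), (270.0, 100.0)]

east = north = 0.0
for bearing, distance in legs:
    theta = math.radians(bearing)
    east += distance * math.sin(theta)     # easting component
    north += distance * math.cos(theta)    # northing component

misclosure = math.hypot(east, north)
print(misclosure < 1e-9)                   # True: the circuit closes
```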
A system which moves from a certain starting point through a sequence of operations back to its starting point is cyclic, like a wave. We have already suggested that the logical not operation provides a mechanism for the emergence of energy from action (6.5), the result being a wave each cycle of which corresponds to a quantum of action. The length of this cycle is inversely proportional to the energy involved. This is the simplest possible cycle. Another more complex cycle is the algorithm of reproduction through which living species tend towards eternal life. Saltwater crocodiles reach sexual maturity at about 15 years and their species diverged from their ancestor about 10 million years ago, so the "crocodile wave" may have executed about 600 thousand cycles. Modern humans, on the other hand, emerged about 300 000 years ago, so we are only about 20 thousand generations old.
Energy is conserved because it is too simple to die. Structures made of energy, like crocodiles, can die due to their complexity, but their power of reproduction greatly extends the life of the species beyond the lives of individuals. From an Aristotelian point of view, we might consider energy as the matter from which all else is constructed, energy being able to take a huge variety of different forms.
Quantum mechanics does not distinguish between potential and kinetic energy. Both enter its equations on an equal footing, and all that counts is frequency differences and the fixed points or nodes that are revealed when these frequencies are superposed. Quantum superposition - Wikipedia
This symmetry in quantum mechanics suggests that it is blind to space and reveals to us only the nodes of the cosmic harmony, which we represent by the eigenvectors and corresponding eigenvalues of quantum transformations. Energy in itself is also blind to the distinction between space and time, and so we see basic quantum mechanics as the study of harmony in an infinite dimensional complex 'frequency space' or 'energy space' which exists without reference to the three dimensional space in which we move. Nevertheless it is the symmetry whose "insight" is the creation of space-time. Space provides a memory for the increase in entropy "desired" by the creative universe. Eigenvalues and eigenvectors - Wikipedia
6.9: Gravitation: the network structure of space-time
The overall structure of the universe is described by general relativity. We can imagine that this structure was one of the first things to form as the universe expanded. We notice that the modern mathematics of general relativity and of fundamental particles (described by variations of Yang-Mills theory) is quite similar, both being built on Lie groups. Gerardus 't Hooft: 50 Years of Yang Mills Theory
As we noted in chapter 1, the initial singularity enjoys the same three features as the classical god: it exists, it is the source of the universe, and it has no structure. It is, in Aquinas' words, omnino simplex. Hawking and Ellis imagine the expansion of the initial singularity to the existing Universe as a time reversed version of the contraction of a black hole. Studies of black holes suggest that although they eventually 'evaporate' for quantum mechanical reasons, the time taken to do so for any reasonably sized black hole is enormous. How, then, can we explain the rapid expansion of the initial singularity implied in the name 'big bang'? Black hole - Wikipedia
The 'big bang' is not an explosion in the ordinary sense of the word, where exploding gases expand rapidly into an already existing space. The big bang describes the increase in size of space-time itself, viewed from the inside. If the Universe is all that there is, there is nothing outside it to expand into. Hard to visualise, but logically consistent! From our point of view, we know that space-time is expanding because the distances to distant galaxies are increasing relative to the size of atoms. The size of atoms is determined by the fundamental physical constants such as the mass and charge of electrons and the quantum of action. Atomic radius - Wikipedia
As Newton realized, gravitation is universal: it affects everything in the Universe without exception. Gravitation is not a force in space, like other physical forces, but a property of space itself. We might understand it to be the property of the formless initial singularity that led it to differentiate itself into the Universe as we know it. Gravitation needs none of the outside help implied in the notion that the Universe is created by a god outside it. How are we to understand this?

Physicists, like surveyors, astronomers and builders, use reference frames to provide a basis for measurement. Such frames are artificial, however, even though they may be based on fixed points in nature. These may range from pegs driven into the ground to the intersection of the plane of the Earth's equator with the plane of its orbit around the Sun. Because they are artificial, changing frames of reference should not change the measured reality. The algorithms for mapping from one frame of reference to another must leave reality unchanged. First Point of Aries - Wikipedia
Einstein's special theory of relativity is based on the idea that the velocity of light is constant no matter how we are moving when we measure it. To achieve this, the transformations between different frames of reference moving at constant velocity ("inertial frames") must follow the Lorentz transformation. Lorentz transformation - Wikipedia
General relativity deals with accelerated motion. The velocity of light is no longer constant and the Lorentz transformation is no longer sufficient to transform away the effects of accelerated frames of reference. Einstein found that he could replace the invariance of the velocity of light with a new standard for reality, the metric which measures the spacetime distance between events. The general theory is a transformation designed so that no matter how we devise coordinates for measuring processes in the universe, distances between events remain unchanged, as they should.
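The invariance described here can be sketched in the special-relativistic case. In this illustrative example (the event coordinates and boost velocity are arbitrary choices, in units with c = 1) a Lorentz boost changes an event's coordinates but leaves the interval t² - x² unchanged; the general theory extends this standard to arbitrary frames via the metric.

```python
import math

# A sketch of the invariance described above: a Lorentz boost at
# velocity v (units with c = 1) changes an event's coordinates but
# not the interval s^2 = t^2 - x^2. Event and velocity are arbitrary.

def boost(t, x, v):
    gamma = 1.0 / math.sqrt(1.0 - v * v)
    return gamma * (t - v * x), gamma * (x - v * t)

t, x = 3.0, 1.0
t2, x2 = boost(t, x, 0.8)

interval_before = t * t - x * x
interval_after = t2 * t2 - x2 * x2
print(abs(interval_before - interval_after) < 1e-9)   # True: invariant
```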
Gravitation currently falls outside the range of quantum theory. This may suggest that it is embodied in the first generation of fixed points in the dynamics of god. Gravitation conserves entropy, so it behaves very much like the unitary evolution of an unobserved quantum system. Since gravitation sees only energy, the exact form of that energy is irrelevant to it. Richard Feynman has suggested that the total energy of the Universe could be zero, the gravitational potential being exactly matched by the energy of the particles that fill the universe. Richard Feynman (2002): Feynman Lectures on Gravitation
When asked why it took so long to develop the general theory of relativity, Einstein replied:
The main reason lies in the fact that it is not easy to free oneself from the idea that coordinates must have an immediate metrical meaning.' Sunny Auyang: How Is Quantum Field Theory Possible? (link above)
Here we rely on the fact that an act or event has no metric. It is simply a change of state. In a divine Universe, an act is an act of god. Although we imagine god as the whole of reality and the physical quantum of action is exceedingly small, because they have no metric they can both be identified simply as pure action. A quantum event is one quantum of action, as is the procession of the Son from the Father.
Languages are distinguished by the algorithms used to encode and decode them. Gravitation, being a universal language, spoken by every particle in the Universe, would appear to need no such algorithm. Indeed, because it appears to have been operative in the structureless initial singularity, we might consider it to be a null language, that is a string of meaningless identical symbols or acts. This is consistent with the idea that energy is simply a time division multiplexed string of quanta of action.
The mathematical starting point for Einstein's general theory is a differentiable manifold. We might imagine this manifold as little pieces of flat Euclidean spaces, rather like the units of chain mail, hinged together by flexible and elastic joints represented by differentiation. This structure enables us to model the curved and dynamic space which encloses the detailed structure of our Universe. There is no measure of distance in the manifold itself, so the same manifold can represent the whole of space-time, no matter how big or small. It is consistent both with expanding and contracting space-time and with fundamental particles. When Einstein applied this mathematical structure to modelling the Universe he introduced a metric which determines the interval between different points in the Universe. This metric depends on the energy present in each neighbourhood. This scheme of transformation, represented by Einstein's field equation, helps us to understand the large scale structure of the whole universe. Einstein field equations - Wikipedia, Differentiable manifold - Wikipedia
We can make this abstract mathematical model more concrete by imagining each unit of inertial space as a particle with fermionic properties and the relationships of the particles as the messages carried between them by bosons. The energy at any point in this structure is the rate of communication between the particles. The energy measures the 'curvature' of local sections of the network which appears in the formal treatment as the metric of the network space. Because we do not want these communications to interfere with one another, we require the structure to be three dimensional, so that connections can be made without interference.
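A toy rendering of this picture may help fix the idea. In the sketch below (the node names and the message log are invented purely for illustration) cells of space are nodes, and the "energy" at a node is read as its rate of message exchange; the busiest neighbourhood plays the role of the high-energy, strongly "curved" region in the analogy.

```python
from collections import Counter

# A toy rendering of the network picture above: nodes as cells of
# space, "energy" at a node read as its rate of message exchange.
# The node names and the message log are invented for illustration.
messages = [
    ("a", "b"), ("b", "a"), ("a", "c"), ("c", "a"),
    ("a", "d"), ("d", "a"), ("b", "c"),            # node "a" is busiest
]

rate = Counter()
for source, target in messages:
    rate[source] += 1          # each message counts at both endpoints
    rate[target] += 1

# The busiest node stands in for the high-energy "curved" region.
busiest = max(rate, key=rate.get)
print(busiest, rate[busiest])
```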
The Earth, from a relativistic point of view, is a region of high energy or intense communication. It curves the space around itself so we do not fall off. We can see this curvature clearly in the motion of satellites, which are continually falling weightlessly, but nevertheless go round the Earth, following an inertial geodesic. Gravitation becomes perceptible because our normal geodesic flow is interrupted by the presence of the Earth.
Astronomy is one of the oldest sciences, very important for human life. This is indicated by many ancient monuments aligned to celestial phenomena such as the summer and winter solstices and equinoxes. Archaeoastronomy - Wikipedia
Since the maturation, mating and breeding of plants and animals are closely related to the annual cycle of the seasons, there is much to be gained from understanding the astronomical sources of this cycle. The heavens also provide us with a frame of reference for navigation. This has evolved from simple visual observations to the precise observations which are possible with current astronomical instruments. Celestial navigation is particularly important for people travelling in featureless environments like deserts and the sea. The notion that the heavens control events on Earth went beyond agriculture to astrological investigations into love and politics. Much early astronomical work was funded by rulers using astronomical observation to seek heavenly guidance. Astrology - Wikipedia
Ancient observers naturally placed the Earth at the centre of the world and imagined the heavens revolving around it. The modern development of astronomy began by turning this upside down. Now we see the heavens as stationary and the Earth as rotating. The next step forward came when it was realized that the Earth is not at the centre of the world, but orbits around the Sun. Once again, many people found this very hard to believe, but it is now common knowledge.
Measurement of the solar system was well advanced by the time of Isaac Newton. Newton set out to discover why the planets move as they do, and discovered the law of universal gravitation. Massive bodies, like moons and planets, attract one another with a force proportional to the product of their masses divided by the square of the distance between them. Newton's work supported Galileo's view that mathematics is the natural language of the Universe. Isaac Newton - Wikipedia
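The law stated in words above is F = G m₁m₂ / r². As a brief illustration (using rounded textbook values for the Earth-Moon pair, which are our own choice of example):

```python
# Newton's law of gravitation as stated above, F = G m1 m2 / r^2,
# applied to rounded textbook values for the Earth-Moon pair.
G = 6.67e-11           # gravitational constant, N m^2 / kg^2
m_earth = 5.97e24      # kg
m_moon = 7.35e22       # kg
r = 3.84e8             # mean Earth-Moon distance, m

force = G * m_earth * m_moon / r**2
print(f"{force:.2e} N")    # about 2e20 newtons
```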
Relativity was the next big step in understanding our cosmic habitat. The general theory was published in 1916, and immediately solved one old astronomical problem, the precession of the perihelion of Mercury. Albert Einstein - Wikipedia, Jose Wudka: The precession of the perihelion of Mercury
An important development was the use of spectroscopy to identify the composition of the stars. In 1868 spectroscopists discovered the element helium in the Sun before it was discovered on Earth. At about the same time, spectroscopic measurements began to show that the light from distant galaxies was redshifted, meaning that they were moving away from the Earth. Using this information, Edwin Hubble formulated Hubble's law which correlated the redshift of a galaxy with its distance from Earth and suggested that the Universe was expanding. Helium - Wikipedia, Redshift - Wikipedia
Since Hubble's time the advent of computers has made detailed modelling of the large scale structure of the Universe feasible. Very large terrestrial telescopes and space based instruments have enabled us to look back to times when the Universe was very young and much smaller than it is now. One of the most useful sources of information about the early Universe is the cosmic microwave background radiation, first discovered in 1964. Cosmic microwave background - Wikipedia, The Illustris Collaboration
The combination of observation and theory has led to widespread acceptance of the history of the universe known as big bang cosmology. This cosmology combines the relativistic model of the large scale structure of the Universe with the quantum field theory of its microscopic structure known as the standard model. Between them, these two models give us a relatively consistent picture of the evolution of the Universe from very shortly after its origin to hundreds of billions of years into the future. Big Bang - Wikipedia, Standard Model - Wikipedia, Peacock: Cosmological Physics
6.10: Cybernetics, algorithms and selection: P & NP
The classical Christian god knows and controls every moment in the Universe. Aquinas explains that God has immediate providence over everything. This is possible because God is both omnipotent and omniscient. How this is possible, given that god is completely simple, is a mystery. This belief is nevertheless a foundation of Christian hope. If god is benevolent toward us, their infinite knowledge and power guarantees that everything is for the best, even though things often look very grim. Accepting this is seen to be a test of faith. Aquinas, Summa, I, 22, 3: Does God have immediate providence over everything?
Here we understand the observable Universe to be the fixed points in the divine dynamics. In the network picture, we understand these fixed points to be messages from god, that is divine revelation. We attempt to understand the relationships between these messages by modelling the underlying dynamics. Our principal tools for this work are quantum theory and communication networks.
Feynman was among the first to realize that there is a correspondence between quantum operators and logical operators. This correspondence is the foundation of quantum information theory, a new and vigorously growing field of research. Developers see two principal advantages in quantum communication. The first is security. The observation of a quantum state causes it to 'collapse' so that any attempt to intercept a message represented as a quantum state breaks the message, thus alerting the communicants to its interception. Nielsen & Chuang: Quantum Computation and Quantum Information
The second is computational power. Proponents of quantum computation believe that because quantum formalism is based on continuous functions in a complete Hilbert space, it can be interpreted as a perfect analogue computer capable of processing infinite superpositions of states simultaneously. Although this may be formally correct, the formal precision of the mathematical model is lost during the measuring process necessary to extract information from the quantum system. This means that from the point of view of actual results, a quantum computation system may have difficulty performing better than a classical Turing machine.
The formal arguments for the determinism of continuous functions require limiting processes that move into the transfinite domain of real numbers. Can these formal arguments be realized? Here we take the view that the limits to mathematical determinism implied by Gödel and Turing's results may prevent real continuous analogue computations from being deterministic, so that the uncertainty manifested by quantum observations may also be present in the quantum dynamics that is the source of these observations.
Given our assumption that the Universe is divine, this constraint on determinism would imply that, unlike the classical god with complete knowledge and control, the cosmic god is not completely deterministic. The consistency of the divine dynamics opens it to the indeterminism implicit in the results of Gödel and Turing. Here we assume that the Turing machine marks the limit on computation. Although there are 2^ℵ0 possible mappings of the natural numbers to themselves, there are only ℵ0 Turing computable functions, so almost all mappings of the set of natural numbers (or any equivalent set) are incomputable. This implies a large degree of uncertainty in observable processes, accounting for the uncertainty of the world. Extraordinary engineering and procedural precautions must be taken whenever we wish to establish deterministic systems like accident-free air travel.
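The counting argument behind this claim can be illustrated with Cantor's diagonal construction: given any listable collection of functions from the natural numbers to themselves, we can build a function missing from the list. The toy enumeration below is of course finite, standing in for a countable one.

```python
# Cantor's diagonal argument in miniature: given ANY listed collection of
# functions from the naturals to the naturals, construct a function that
# differs from every entry. Since the computable functions can be listed,
# they cannot exhaust all such mappings.

def diagonal(listed_functions):
    """Return a function that differs from the n-th listed function at n."""
    return lambda n: listed_functions[n](n) + 1

# A toy "enumeration" of a few functions:
fs = [lambda n: 0, lambda n: n, lambda n: n * n]
d = diagonal(fs)

for i, f in enumerate(fs):
    assert d(i) != f(i)  # d escapes every function on the list
```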
We get a more detailed insight into the relationship of determinism to uncertainty through the cybernetic principle of requisite variety. Gregory Chaitin sees this principle as closely related to Gödel's work on incompleteness. From this point of view, no proof can reach a conclusion containing more information than the hypotheses from which the proof is drawn. This idea holds at all levels of complexity, so that we cannot draw a conclusion comprising n+1 bits of information from n bits of input. Gregory J. Chaitin: Gödel's Theorem and Information (link above)
Chaitin's interpretation of Gödel's theorems shows us the necessary conditions for control in the Universe. This is in a sense a modernization of Aristotle's axiom that no potential can actualize itself, where we use Lonergan's understanding of potency and act: potency means intelligible, act means actually understood. In this case, the possible interpretations of a set of data are effectively infinite until we know the algorithm by which the data were encoded.
The aim of the mathematical theory of communication is to establish deterministic motion from past to future. Because entropy tends to increase, however, the future is generally more complex than the past and so cannot be controlled by it.
Evolution is built upon variation and selection. Variation is possible because some transformations are not computable or controllable. Selection picks out the variations that are consistent with survival and reproduction. Survival and reproduction require control, that is computability. This means we have the situation envisaged by the P – NP problem in the theory of computation. P versus NP problem - Wikipedia
Turing showed that some problems are incomputable, meaning that they cannot be solved by a deterministic process. This does not exclude the possibility that problems may be solved "accidentally" by random processes. This appears to be an important feature of creation. Insofar as creation means introducing structures that have never existed before, it is hard to imagine it being the conclusion of a deterministic logical process.
We imagine the complexification of the universe in terms of Cantor's idea for the generation of transfinite numbers by combining and permuting the natural numbers. The P – NP problem asks whether a solution found by whatever means can be verified by a computable process. Evolution by natural selection suggests that incomputable problems can be solved by random variations and the solutions can be verified by the computable processes of survival and reproduction.
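A minimal sketch of this asymmetry, using subset sum as a stand-in NP problem: finding a subset with a given total takes a search over exponentially many candidates, while verifying a proposed answer is quick, much as random variation proposes and cheap selection verifies.

```python
# The P-NP asymmetry in miniature, with subset sum: FINDING a subset of
# numbers with a given total seems to require search over 2^n candidates,
# but VERIFYING a proposed solution takes only a quick check.

from itertools import combinations

def find_subset(numbers, target):
    """Exponential search: try every subset (the 'hard' direction)."""
    for r in range(len(numbers) + 1):
        for combo in combinations(numbers, r):
            if sum(combo) == target:
                return list(combo)
    return None

def verify_subset(numbers, target, candidate):
    """Polynomial check: is the candidate a genuine subset with the right sum?"""
    pool = list(numbers)
    for x in candidate:
        if x not in pool:
            return False
        pool.remove(x)
    return sum(candidate) == target

numbers = [3, 9, 8, 4, 5, 7]
solution = find_subset(numbers, 15)          # slow in general
print(verify_subset(numbers, 15, solution))  # True, and checked quickly
```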
The key to the evolution of living creatures lies in the permanence of the genome. There are many mechanisms in place to ensure the accurate copying of DNA and RNA. Even in creatures that evolve very quickly like viruses, the maximum rate of base changes in the genome seems to be small, about one base per million per generation. Without some sort of memory to preserve "progress so far" it is hard to see how more complex structures can be established by a recursive process of complexification. Rafael Sanjuán et al: Viral Mutation Rates
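The importance of copying fidelity can be made quantitative: with a per-base error rate μ and a genome of length L, the chance of a perfect copy is (1 − μ)^L. The figures below are round illustrative numbers, not measured values.

```python
# A quick quantitative gloss on copying fidelity: with per-base error rate
# mu and genome length L, the chance of a perfect copy is (1 - mu)**L.
# Round illustrative figures, not measured values.

def perfect_copy_probability(mu, genome_length):
    return (1 - mu) ** genome_length

# A small viral genome at one error per million bases still copies
# faithfully most of the time...
print(perfect_copy_probability(1e-6, 10_000))  # ~0.99

# ...but a human-scale genome at that rate would almost never copy
# cleanly, which is why proofreading and repair machinery matter.
print(perfect_copy_probability(1e-6, 3_000_000_000))  # 0.0 (underflows: about e^-3000)
```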
A solution may lie in the layered structure of computer networks. Although the transfinite model outlined in chapter 5 is based on Cantor's transfinite numbers, transfinity is formally a relative concept, so that we may see 2 as transfinite with respect to 1. If we consider the initial singularity, cardinal 1, as the first layer of the universal network, the second layer may have cardinal 2 and the third cardinal 4 and so on. If we think of computation in terms of clock cycles, the frequency of execution of processes in the higher layers decreases in proportion to their complexity, and they may be said to live longer and so act as memory to preserve the structures of the simpler systems upon which they rely for their existence.
The details of this process of evolution are not clear to me, but, given that the universe did start as a completely simple initial singularity and that it is now exceedingly complex, there must be some process based on the preservation of randomly constructed recursive structures to explain this complexification. The fact that we find group theory widely represented in the structure of the world suggests that grouplike structures, insofar as they are closed and self-perpetuating, are an important feature of this process.
6.11: Representation: fundamental particles
So far we have created a universe of energy out of the initial singularity of action by the application of the logical not operator to make a wave of sequential actions. The existence of energy enables the existence of gravitation and quantum mechanics, but we have yet to see details which might account for the large scale structure of the Universe, and my guess is that we might find the source of this structure in quantum mechanics. We discussed quantum mechanics in an abstract form in chapter 5, but now we turn to making it real, which raises problems which fall under the general heading of representation.
Historically meaning, knowledge and information have been considered as rather immaterial aspects of the world, but here we agree with Landauer that information is a physical entity and the same goes for knowledge and meaning. To get some grip on the representation problem, we need to look at the philosophical step that marked the difference between Plato and Aristotle.
Many philosophers have tried to understand the nature of the Universe by studying the nature of knowledge. This present project started in the 1960s with Bernard Lonergan's effort, in Insight, to put Thomistic metaphysics on a new more modern footing. He had to be careful what he said, and not go too close to the boundary of orthodoxy. It was necessary for him to devise a metaphysical model which respected the business plan of his church. This required him to classify knowledge into humanly intelligible proportionate knowledge accessible to science and inaccessible transcendent knowledge, the exclusive intellectual property of his church acquired by direct revelation from god. As required, this model puts the nature of god beyond human comprehension.
The Papacy responded to the "Modernist Crisis" in the Church with their standard approach of condemning new ideas and reaffirming old ones. In 1864 Pope Pius IX produced the Syllabus of Errors, a condemnation of 80 statements which contradicted Catholic belief. On the positive side his successor, Pius X, advised the Italian schools of theology that the principles and major opinions of Thomas Aquinas should be held conscientiously. This led the schools to draft a list of 24 Thomistic theses attempting to capture the essence of Thomism. These were approved by the Congregation of Studies as safe directive norms. As far as I know these remain in place and give some substance to the requirement in Canon Law that aspirants to the priesthood be trained in Thomistic theology. Pius X: On the doctrines of the modernists Pascendi dominici gregis, P. Lumbreras: The Twenty-Four Fundamental Theses of Official Catholic Philosophy: Part I, John Paul II (1983): Code of Canon Law: Canon 252: §3.
The key thesis, from our point of view, is number 23, which harks straight back to Plato who considered his immaterial forms the true subject of intellectual knowledge:
Intellectuality necessarily follows immateriality, and in such a manner that the degree of intellectuality is in proportion to the remoteness from matter. The adequate object of intellection is being as such; but the proper object of the human intellect, in the present state of union, is restricted to the essences abstracted from material conditions.
The official philosophy of the Catholic Church holds, in effect, that real reality is immaterial, and that the human soul, since it has intellectual powers, must also be immaterial. The problem here is that immaterial effectively means invisible, since our senses depend upon physical particles of one sort or another to gather information. God, therefore, must be invisible, which completely contradicts the premise upon which our scientific theology is built, that the god is identical to the universe and therefore scientifically observable.
The original answer to this problem comes from Aristotle's modification of Plato's theory of forms. From Plato's point of view, the forms are eternal and immutable. Aristotle accepted this but accommodated change by imagining that change involved the replacement of one form by another in the same matter, as we might, for instance, mould the bronze of a sword into a ploughshare. As a by-product, in effect, forms were represented in a physical guise that opened them to observation and study, so we could learn about horses by studying actual physical horses.
Since Plato's forms were invisible, he had to devise a different theory of knowledge. He imagined that the souls of the unborn once lived in the heaven of forms and came to know them. People were born full of this innate knowledge, even though they did not know this. Plato imagined a methodology which has become known as the Socratic Method for eliciting the knowledge that was believed to already be present in peoples' minds. Platonic epistemology - Wikipedia
Following Parmenides, Plato drew a sharp distinction between knowledge, which was based on the forms, and opinion which was derived from interacting with the world and its people. This idea eventually filtered into Gnosticism, where the goal was to achieve knowledge of the supreme divinity through some form of mystical or esoteric insight.
The Platonic influence is very strong in some approaches to quantum field theory. Some hold that the invisible fields, analogous to the Platonic ideas, are the real reality, and that the particles we actually observe are of lesser importance. I have already quoted the philosopher Auyang:
According to the current standard model of elementary particle physics based on quantum field theory the fundamental ontology of the world is a set of interacting fields.
Although this system works quite well it leads to problems of infinity and ludicrously large estimates for a number of measured physical parameters which fall under the general category of "cosmological constant problems".
Silvan Schweber summarizes the sources of the infinities:
These difficulties stem from the fact that (i) the description is in terms of local fields (i.e., fields that are defined at a point in space-time), which are systems with an infinite number of degrees of freedom, and (ii) the interaction between fields is assumed to be local. . . .
In QED the local interaction terms imply that photons will couple with (virtual) electron-positron pairs of arbitrarily high momenta and result in divergences, and similarly electrons and positrons will couple with (virtual) photons of arbitrary high momenta and give rise to divergences. These problems impeded progress throughout the 1930s, and most of the workers in the field doubted the correctness of QFT in view of these divergence difficulties. Silvan S. Schweber: The sources of Schwinger's Green's functions
The infinity problems were ultimately removed or hidden in the late 40s by renormalization, which opened the way for a theory of quantum electrodynamics which has yielded theoretical results indistinguishable from the best available measurement. It seems unlikely that this is the last word, however, since the theory continues to yield ludicrous results. Frank Wilczek, one of the Nobel prizewinners for the development of quantum chromodynamics, sings the praises of the new theory in his book on the subject, but has to admit, in very small type, that various computations yield results which are 10^44, 10^56 and 10^112 times greater than observation. These figures represent the greatest differences yet between physical calculations and reality and point to some difficulty with the calculations. Frank Wilczek: The Lightness of Being: Mass, Ether, and the Unification of Forces
Insight is considered to be an act of intelligence and we are inclined to think of ourselves as the only really intelligent species on the planet. Here, however, we see insight and intelligence at all levels of the Universe. We have already speculated about the origin of time and space to give us 4-dimensional space-time. The first step, from pure action to time and energy, lays the foundations for quantum mechanics.
6.12: Quantum communication: bosons connect fermions
Although our theories of fundamental particles raise many questions, there is no doubt about the particles' existence and behaviour, and we have seen an explosion in technology arising from the application of quantum systems. These technological advances have improved our understanding of the world at all scales, from fundamental particles through genetics and molecular physiology to the structure of the universe.
General relativity is one of the most amazing results of the application of continuous mathematics and calculus. At present no quantum field theory of gravitation has been created, because it has been found impossible to use renormalization to eliminate the infinities that appear in quantum theories of gravity. Quantum field theories are an attempt to unite quantum theory and special relativity. Trouble in this union starts at the very beginning, because these theories have a foot in each of two very different spaces: Hilbert space is the natural home of quantum theory and has next to nothing to do with classical Minkowski space-time, while Minkowski space is the home of special relativity. The central problem is that quantum theory does not provide a mechanism for the creation and annihilation of discrete particles, while special relativity, by establishing the same relationship between momentum and energy as there is between space and time, interprets all motion as a process of creation and annihilation.
The usual approach to resolving this dilemma is to assume that space-time is the domain of Hilbert space, so that the Lorentz transformations we use to relate systems in relative inertial motion are also applied to the Hilbert spaces that describe these systems. In 6.6 we imagined that spacetime may be an application of quantum mechanics. This suggests that transforming Hilbert spaces with Lorentz transformations may be placing the cart before the horse. If there is to be a quantum theory of gravitation, we might expect quantum mechanics to be in place before space-time structure emerges.
6.13: Network QFT, QED and QCD
The most common representation of quantum mechanics is Erwin Schrödinger's wave mechanics. Historically this was the second successful version of quantum mechanics. The first, matrix mechanics, was published by Heisenberg, Born and Jordan slightly before Schrödinger's work. It was soon found that the two were different mathematical representations of the same idea. The work was continued and perfected by Dirac's transformation theory, and von Neumann tidied up the mathematics by using abstract Hilbert space as the domain for quantum mechanics. Schrödinger equation - Wikipedia, Matrix mechanics - Wikipedia, Paul Dirac: The Principles of Quantum Mechanics, John von Neumann: The Mathematical Foundations of Quantum Mechanics (link above)
The various representations of quantum mechanics listed here point to a particular difficulty with the theory. It differs from classical Newtonian mechanics on three points. First, it is mostly about particles that are too small to see, whereas classical mechanics deals with planets, moons and other macroscopic objects; second, classical mechanics provides clear connections between its mathematical model and physical realities like positions in space, velocities and masses, whereas quantum states are described by vectors in a space with any number of dimensions from one to countable infinity; and third, the inner workings of quantum mechanics are mostly expressed in complex numbers which do not directly correspond to anything observable.
Space adds to time the possibility of two or more things existing at once, spatially separated. Whereas it would be a local contradiction for p and not-p to exist at the same place at the same time, they can both exist at the same time in different places. Traditional theology denies that God is a body which occupies space because they understand that space is infinitely divisible and therefore potential, contradicting the nature of god, which is pure act. The logical definition of space proposed here does not imply potential in the ancient sense, and so is in no way inconsistent with divinity. Aquinas, Summa: I, 3, 1: Is God a body
Nor is space a passive container, as the ancients imagined. It is the home of momentum, and momentum and distance relate to one another very much like energy and time. Whereas energy is the rate of action in time, momentum is the rate of action in space, so space and time are connected by action through the velocity of light. We think that energy came first as action bifurcated into energy and time; now energy-time bifurcates again into momentum and space. Spacetime and energy-momentum are mathematically represented in physics in exactly the same way, pivoting around the fundamental metric of the universe which is action.
Descartes made a good start when he noted that clear and distinct ideas are a criterion of truth. In my Scholastic days, one of the watchwords was opportet distinguere: it is necessary to distinguish the different meanings of a term to get a clear idea of what we are talking about. Much of the progress in science can be attributed to sharpening our language as we confront the unbelievable complexity of the universal structure, detailed events stretching right down to the quantum of action. A fly's footprint involves the coordination of trillions of trillions of quanta of action.
The biggest problem facing physics for the last century has been the interface between quantum mechanics and relativity, that is, of energy and momentum with time and space. This has led to a persistent problem with the appearance of infinities in the mathematics which are very unlikely to exist in reality. In his Nobel lecture Feynman characterized renormalization, the technique devised to remove these unlikely infinities, as sweeping problems under the rug:
I don’t think we have a completely satisfactory relativistic quantum-mechanical model, even one that doesn’t agree with nature, but, at least, agrees with the logic that the sum of probability of all alternatives has to be 100%. Therefore, I think that the renormalization theory is simply a way to sweep the difficulties of the divergences of electrodynamics under the rug. I am, of course, not sure of that.
Here, I feel, we owe clarity once again to Feynman, who was among the first to realize that quantum mechanics has very little to do with space-time physics and is in fact a description of the substratum of computation that makes the world go round. Misner, Thorne and Wheeler touched on this idea with their "pregeometry", and in the last few decades we have seen an explosion of interest in quantum computation, still somewhat muddied by its origins in the continuous mathematics of classical physics. Richard P. Feynman: Nobel Lecture: The Development of the Space-Time View of Quantum Electrodynamics, Misner, Thorne & Wheeler: Gravitation
Renormalization has become so central to field theory that it has become one of the criteria for a valid theory. Current theory sees gravitation as unrenormalizable, so it remains outside the standard model. Huang points out that the rot set in when people began to use continuous mathematics to compute the self energy of the electron. Wilson identified the problem as a matter of scale. Here, since we have nothing to lose, we reject continuous mathematics and replace it with quantized action, and base our picture on the scale invariance of communication networks. More detailed discussion would take us too far afield for this exploratory essay. Kerson Huang: A Critical History of Renormalization, Kenneth G Wilson: Nobel Lecture: The Renormalisation Group and Critical Phenomena
Let us therefore assume that non-relativistic quantum mechanics describes the logical and computational layer of the universe that emerged on the foundation of energy and time and is itself the foundation of space-time. We have already noted that the foundation of a practical computer is the clock that keeps everything in synchronization, and that the clock itself is in fact a rudimentary computer, executing a string of nots which create a wavelike structure which we can model with a qubit |ψ> = cos(t)|0> + i sin(t)|1>, normalized by the fact that cos²(t) + sin²(t) = 1. Nielsen & Chuang: Quantum Computation and Quantum Information (ref above)
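A short sketch of this clock qubit, checking that the state stays normalized as the parameter t advances:

```python
# The clock qubit: |psi> = cos(t)|0> + i sin(t)|1>.
# The state stays normalized for all t because cos^2(t) + sin^2(t) = 1,
# and the amplitudes oscillate like a clock's tick.

import math

def clock_qubit(t):
    """Amplitudes (a0, a1) of the clock state at parameter t."""
    return (math.cos(t), 1j * math.sin(t))

def norm_squared(state):
    """Sum of squared moduli of the amplitudes; 1 for a normalized state."""
    return sum(abs(a) ** 2 for a in state)

for t in (0.0, 0.7, math.pi / 2, 3.1):
    assert abs(norm_squared(clock_qubit(t)) - 1.0) < 1e-12

print(norm_squared(clock_qubit(0.7)))  # 1.0 (within rounding)
```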
From the picture of the mathematical community (6.3) we may conclude that the real action occurs inside the mathematicians. Scale invariance suggests that the same can be said for fundamental particles, quanta of action embodying computational processes that determine their behaviour when communicating with one another in the cosmic network. We may say the mathematicians are fermions. All their words, questions, papers, books, speeches and videos fall under the heading of bosons. The action goes on inside the fermions and they communicate with one another via the bosons. The principal property of fermions is that they stand aloof and protect their integrity, while bosons are happy to share their space. We may see this as the quantum mechanical source of 4D spacetime and the root of gravitation.
Although fundamental particles have zero size they are nevertheless capable of representing computations by logical confinement which is a consequence of fixed point theory. Each particle is an image of the universe, closed, logically continuous, convex and animated by a quantum of action.
Quantum mechanics works well in low energy situations where, apart from photons, particles are moving slowly compared to the velocity of light. Apart from photons (once more) the particles involved can be considered as permanent structures. As velocities and energies go up, however, things change. First, since mass and energy are effectively identical, massive particles can be created and annihilated; and second, there is the complex issue of virtual particles, that is unobservable particles which even at relatively low energies can be created and annihilated within a quantum pixel, because this is allowed by Heisenberg's uncertainty principle as long as their energy ΔE is inversely proportional to their lifetime Δt, so that ΔE × Δt ≈ h, where h is Planck's constant. Uncertainty principle - Wikipedia
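The bookkeeping behind virtual particles can be sketched numerically; the pair energy below is a rough order-of-magnitude figure for an electron-positron pair, used purely for illustration.

```python
# Virtual-particle bookkeeping: a fluctuation of energy delta_E may
# persist for roughly delta_t ~ h / delta_E before the books must
# balance. Rough order-of-magnitude figures, for illustration only.

h = 6.626e-34  # Planck's constant, J s

def max_lifetime(delta_E):
    """Rough lifetime allowed to an energy fluctuation delta_E (joules)."""
    return h / delta_E

pair_energy = 1.6e-13  # ~1 MeV, the order of an electron-positron pair, J
print(max_lifetime(pair_energy))  # ~4e-21 seconds
```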
A further problem arises because it is believed that no causal influence can travel through space at a velocity greater than light so that the actual causal interactions of quantum systems are local, assumed to occur between fundamental particles of zero size in contact with one another. This interaction is explained by quantum field theory, which is worked out in the continuous complex space of quantum fields. These fields are mathematical functions built on the foundation of continuous spacetime.
The finite size of the quantum of action and the relationships between energy, time, distance and momentum mean that if we want to see processes that occur very quickly in very small regions of space-time, we must observe them at very high energy and momentum. This is why the physics community has been driven over the last century to devise machinery operating at ever greater levels of energy and momentum to observe particles at the smallest possible scale. These machines are in effect giant microscopes, some many kilometres in size.
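The 'giant microscope' idea follows from the de Broglie relation λ = h/p: the greater the momentum, the shorter the wavelength and the finer the resolvable detail. The beam energy below is a rough LHC-scale figure, treated ultrarelativistically (E ≈ pc) for simplicity.

```python
# Why accelerators are 'giant microscopes': the de Broglie wavelength
# lambda = h / p shrinks as momentum grows, so higher energy resolves
# smaller structures. Figures are rough and illustrative.

h = 6.626e-34  # Planck's constant, J s
c = 3.0e8      # speed of light, m/s

def de_broglie_wavelength(momentum):
    """Wavelength (m) associated with a particle of the given momentum."""
    return h / momentum

# Momentum of a ~7 TeV proton, treating E ~ pc at these energies:
E = 7e12 * 1.602e-19  # 7 TeV converted to joules
p = E / c
print(de_broglie_wavelength(p))  # ~2e-19 m, far below the proton's ~1e-15 m radius
```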
Although this machinery works by the laws of nature and is capable of imitating conditions inside the sun, and even very close to the first moments of the expanding universe, the mathematical theory has problems with infinity. Although the universe clearly works as a consistent whole, we cannot yet say the same about our mathematical models of the universe. It is widely recognised that the theory we have so far is a temporary expedient en route to a complete theory of the fundamental structure of the world. We know enough, however, to live in an era of explosive development of new technologies based on quantum theory.
6.14: Corruption and death
Aristotle and his contemporaries thought that the intellectual element of a human being is immortal or eternal. This is because they thought of matter as something inert and incapable of understanding, so that the understanding part of our minds must be spiritual. They thought that spiritual beings, because they had no parts, were incapable of coming apart, and so everlasting.
Quite reasonable, really, but impossible and so wrong.
The transfinite numbers explain imagination and variation. From a formal mathematical point of view we find no problem imagining the infinite set of natural numbers and the transfinite sets that are built on them. We observe in reality, however, that all information is represented physically and the physical world is very complex, a veritable haystack of signals travelling between billions of sources. Where signals cross they may become corrupted. In this case sources that rely on reliable signalling may lose contact with one another, leading to a breakdown in the bond between them. In the darker realms of the human world, bad agents may do what they can to corrupt signals to break up human relationships, businesses or political parties. The internet has provided extensive new opportunities for such activity, often allowing perpetrators to hide themselves by encryption and other means. Propaganda - Wikipedia
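The detection of corrupted signals can be sketched with the simplest error-detecting code, a single parity bit: the receiver cannot repair a flipped bit, but can notice that one has flipped.

```python
# Signal corruption and its detection, in miniature: a single parity bit
# lets a receiver notice (though not repair) any single flipped bit.
# This is the simplest member of the family of error-detecting codes
# that keep networked sources in reliable contact.

def with_parity(bits):
    """Append a parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(message):
    """A received message is suspect if its 1s no longer sum to even."""
    return sum(message) % 2 == 0

sent = with_parity([1, 0, 1, 1, 0])
assert parity_ok(sent)

corrupted = sent.copy()
corrupted[2] ^= 1            # one bit flipped in transit
print(parity_ok(corrupted))  # False: the corruption is detected
```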
In living organisms many structural errors can be corrected by the death and recycling of damaged cells and the reproduction of pristine replacements. This does not apply to all tissues however, and so we see the effects of ageing in our skin and hair and feel them in increasing stiffness and weakness of our bodies. These natural processes are supplemented by disease and accident so that our ultimate fate is death through fatal error of one sort or another. Ancient dreams of eternal life do not have a place in modern biology.
6.15: Evil
The President of the United States, Ronald Reagan, called the Soviet Union an evil empire, while at the same time trying to encourage Gorbachev and the Soviet leadership to liberalize their regime and convert to the American way of life. His words were another step in the long-held desire to free Soviet citizens from oppression, rather as American liberals had once hoped to free the slaves.
Deception is one of the important tools of evolutionary survival. Plants and animals often use it. Among the functions of deception are both escape from predation and the capture of prey. From the point of view of a hunter, a successful hunt is a good; from the point of view of the prey, it is an evil. Good and evil are thus indissolubly linked in the biological world.
The network model suggests that there are two roots of evil. One lies in the small size of the computable space when compared to all possible space. We may say that the computable space is full, and that this fullness is represented by the conservation of energy through time. Quantum mechanics identifies energy with frequency, and here we like to link frequency and processing rate. As a consequence of the conservation of energy we have the conservation of matter.
When we are in computable space, that is in space that we can control, evil can be avoided. This is the fundamental rule of road safety: maintain control of the vehicle. The same principle applies to occupational health and safety: we seek to have no uncontrolled events ('accidents') in the workplace. Ideally, there is a tested procedure for every action and enough control to make sure things proceed as planned.
Since all information is represented physically, the availability of the physical resources necessary to represent the information places a limitation on the processing power of systems. So we find in practical computers limitations on physical memory and processing speed. On the other hand, a person needs a certain amount of nutrition and a certain amount of body mass to survive. From this point of view, we may see all evil as a form of starvation, insufficient resources being available to prevent error.
The second source of evil lies in the hierarchical structure of the universe. In general lower layers are more energetic and violent than higher layers. A bullet tearing through flesh is acting naturally for a bullet, a high energy lump of metal, but it is doing evil from the point of view of a living body. This source of evil is particularly important in occupational health and safety, since a large proportion of our industrial processes require high energy, high pressure, work at great heights and motion at great speed, all of which can get out of hand.
Most animals kill other animals and plants to eat, but humans, with our superior intelligence, have learnt to kill for more sophisticated reasons. So farmers kill weeds, insect and animal pests. Raiders and warlords, on the other hand, are inclined to kill farmers to take their food and land. With improvements in weapons and methods of war, some warlords have been able to establish hegemony over large numbers of people and ultimately establish large empires. Much of recorded human history is the history of the wars of warlords, kings and emperors.
We may place evils on a spectrum ranging from what we might call natural evils, like those implicit in survival by hunting and gathering, disease and natural disasters, to specifically human evils that run from lethal domestic violence to full scale war on the one hand, and from mild environmental modification to wholesale destruction of ecosystems on the other. We will turn to human evil in chapter 9.
Bad things die out, although this may take a long time, as we see in the slow reduction of poverty and disease in the world. This reduction is often impeded by the selfish behaviour of individuals and groups who would prefer to destroy rather than build for political reasons. We turn to politics in chapter 8.

6.16: Does the model fit the world?
Many feel that the world is eternal: it has always been like this and the question of its origin is meaningless. Others feel that it had a beginning, and are therefore compelled to wonder how it came to be this way. The Christian tradition answers this question with a simple hypothesis: there is a narcissistic omniscient and omnipotent being who created this universe and populated it with intelligent beings to bring glory to their creator by worshipping them. Although this idea is very durable, and was probably motivated by the existence of narcissistic warlords likely to kill those who did not bow low enough, it may not seem very plausible.
The big bang hypothesis does not seem much better. There had to be something to go bang, and since it is somewhat plausible that space and time began with the big bang, the prior state must have existed without space and time and so, like the god of Aquinas, without the structure necessary to be either omniscient or omnipotent.
The assumption in the big bang hypothesis that the precursor of the universe comprised all the energy of the current universe raises two further difficulties. First, can we make any sense of an entity of zero size containing a huge amount of energy, implying that its energy density is infinite? This is a generic problem for modern physics, since all the proposed fundamental particles also have zero size and finite energy, implying infinite energy density. The second problem is that since energy is conjugate to time, the initial entity might have had no size, but, like all the energetic fundamental particles, it must have existed in time.
High energy physicists use their machinery to create energy bubbles of finite size which decay into fundamental particles. Millions of experimental collisions have shown that there is a limited spectrum of these particles and they all have well defined properties. How are we to account for all this structure arising from what we take to be a more or less structureless blob of energy?
Much of the information we have about the relationships of particles to one another is consistent with the mathematical theory of groups, but we also rely on a lot of unexplained empirical information about the masses, charges and other features of these particles. It is clear that we still have a lot to learn.
I see this essay so far as a string of obiter dicta (Latin for 'things said in passing') written while wandering around in the space of problems raised by the relationship between god and physics. We will now turn to applying whatever lessons have been learnt on this journey to our own roles as particles in a divine universe. We know that we came to be the way we are by a very long process of evolution. It may be that the nature of the fundamental constituents of the universe will ultimately be explained by a similar process.
(Revised 24 December 2020)