scientific theology

This book is part of a project to develop a new scientific and democratic foundation for a catholic theology to replace current theological fictions

Scientific theology: a new history of creation

Chapter 6: Creating the divine world

6.1: Fitting physics to theology
6.2: Clues from the ancient god
6.3: The mathematical community
6.4: Divinity to trinity
6.5: Physical evolution 1: action to energy
6.6: Physical evolution 2: energy to entropy
6.7: Quantum theory and the creation of space-time
6.8: Measurement, creation and insight
6.9: Gravitation: the network structure of space-time
6.10: Cybernetics, algorithms and selection P & NP
6.11: Representation: fundamental particles
6.12: Quantum communication: bosons connect fermions
6.13: Network QFT, QED and QCD
6.14: Corruption and death
6.15: Evil
6.16: Does the model fit the world?

6.1: Fitting physics to theology

If we are to have a divine world, we must fit physics to theology. At present theology is built on the ancient Greek picture of the physical world, while physics is based on the medieval theology of a world created by an omniscient and omnipotent god. Both pictures need the revision that will come from bringing them into contact with one another. In both cases the central problem is that they overlook the fact that information is not just something spiritual or metaphysical; it is a physical entity, represented by physical vehicles of all sizes and complexities, from fundamental particles to the universe itself. The Bible was once made of ink and paper; now it may be encoded as a collection of little magnets, or electrons. Rolf Landauer: Information is a physical entity

In theology the model of god developed by Aquinas and the medieval scholastics has not changed for 800 years. If anything, the attempt to understand god has been abandoned in the face of the claim that god is so far beyond our ken that we can say nothing about them. This position was strengthened by the protestant revolution that divided the Christian Churches in the sixteenth century. Theology turned away from developing the science of god and turned instead to interpretation of the Bible, which is largely a political, poetic and cultural document with very little to say about the reality of god or the world.

Aquinas, like most of the ancients, equated knowledge with immateriality, that is, spirituality. Is there knowledge in God? Yes, he says, arguing from Aristotle's doctrine of matter and form:

In God there exists the most perfect knowledge. To prove this, we must note that intelligent beings are distinguished from non-intelligent beings in that the latter possess only their own form; whereas the intelligent being is naturally adapted to have also the form of some other thing; for the idea of the thing known is in the knower. Hence it is manifest that the nature of a non-intelligent being is more contracted and limited; whereas the nature of intelligent beings has a greater amplitude and extension; therefore the Philosopher says (De Anima iii:8, 431b20) that "the soul is in a sense all things." Now the contraction of the form comes from the matter. Hence, . . . forms according as they are the more immaterial, approach more nearly to a kind of infinity. Therefore it is clear that the immateriality of a thing is the reason why it is cognitive; and according to the mode of immateriality is the mode of knowledge. Hence it is said in De Anima ii that plants do not know, because they are wholly material. But sense is cognitive because it can receive images free from matter, and the intellect is still further cognitive, because it is more separated from matter and unmixed, as said in De Anima iii. Since therefore God is in the highest degree of immateriality . . . it follows that He occupies the highest place in knowledge. Aquinas, Summa: I, 14, 1: Is there knowledge in God?

From a modern point of view, this discussion completely misses the point. It is a general metaphysical argument which does not touch on the mechanism of knowledge. Aristotle does come close when he defines the soul as the first actuality of a natural body possessed of organs. Nevertheless, he felt that the universality of human intellectual knowledge required a spiritual mind. Christopher Shields: Aristotle; Christopher Shields 1996: The Active Mind of De Anima III 5

Aristotle was not to know that the organs of a living body are constructed on a microscopic molecular scale that only became visible with the advent of electron microscopes. Indeed the structure of the world continues well beyond the reach of microscopes into the atomic nucleus, whose scale is typically millions of times smaller than any molecule. Every element of this structure may serve as a representative vehicle, so that the Universe has vast information carrying capacity.

In physics, the representation problem emerged with the development of quantum theory. Since Galileo's time, the importance of mathematics as the route to physics has grown. Isaac Newton used the power of mathematics to explain the role of gravitation in the motion of the Solar System. The application of mathematics in classical physics is relatively straightforward, since it is easy enough to correlate mathematical symbols with physical variables like number, time, mass, energy, momentum and so on.

The difficulty arises when we turn to quantum mechanics. Sunny Auyang writes:

According to the current standard model of elementary particle physics based on quantum field theory the fundamental ontology of the world is a set of interacting fields. Two types of field are distinguished, matter fields and interaction fields. . . .. The quanta of matter fields, called fermions, have half integral spins. The electron is a fermion; its spin quantum number is ½. . . .. The quanta of interaction fields, called bosons have integral spins. The photon is a boson. Its spin quantum number is 1. . . .. Fermion - Wikipedia, Boson - Wikipedia

There are 12 matter fields and each has an antifield. . . ..

There are four fundamental interactions. Gravity . . .. Electromagnetism . . .. The strong interaction . . .. The weak interaction . . .. Sunny Auyang: How is Quantum Field Theory Possible? page 45. Gravitation - Wikipedia, Electromagnetism - Wikipedia, Strong interaction - Wikipedia, Weak interaction - Wikipedia

If "the fundamental ontology of the world is a set of interacting fields", we must ask how these fields are represented. Auyang's statement appears to imply that the fields are prior to the particles. This raises the "measurement problem", present almost since the beginning in the interpretation of quantum mechanics. Measurement in quantum mechanics - Wikipedia

Mathematicians and physicists represent their data, their thought and discoveries in "the literature", the physically observable set of marks which represent everything publicly shared by the community.

An important and very succinct piece of mathematical text in quantum mechanics is the Schrödinger equation:

iħ ∂/∂t |ψ(t)> = H |ψ(t)>

Here H is the Hamiltonian or energy operator, which transforms the ket |ψ(t)> as time goes by. As long as the system is undisturbed this transformation is deterministic and unitary, equivalent in communication terms to a lossless (reversible) codec. |ψ(t)> is a vector in a Hilbert space formed by the superposition of the complete (possibly infinite) set of basis states of the system. Codec - Wikipedia
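The sense in which unitary evolution acts like a lossless codec can be sketched in a few lines of Python. This is a toy two-state system: the matrix U below is an illustrative rotation standing in for the physical evolution operator, not a real Hamiltonian. Applying U and then its conjugate transpose recovers the original ket exactly, and the norm (total probability) is preserved throughout.

```python
import math

def mat_vec(m, v):
    """Multiply a 2x2 matrix by a 2-vector of complex amplitudes."""
    return [m[0][0]*v[0] + m[0][1]*v[1],
            m[1][0]*v[0] + m[1][1]*v[1]]

def dagger(m):
    """Conjugate transpose: the inverse of a unitary matrix."""
    return [[m[0][0].conjugate(), m[1][0].conjugate()],
            [m[0][1].conjugate(), m[1][1].conjugate()]]

def norm(v):
    """Length of the state vector: total probability is its square."""
    return math.sqrt(sum(abs(a)**2 for a in v))

theta = 0.7  # arbitrary evolution parameter, playing the role of Ht/ħ
U = [[complex(math.cos(theta)), complex(-math.sin(theta))],
     [complex(math.sin(theta)), complex(math.cos(theta))]]

psi = [complex(0.6), complex(0.8)]       # a normalized two-state ket
evolved = mat_vec(U, psi)                # "encode": unitary evolution
recovered = mat_vec(dagger(U), evolved)  # "decode": reverse the evolution

print(round(norm(psi), 6), round(norm(evolved), 6))  # 1.0 1.0: no loss
print([round(abs(a - b), 9) for a, b in zip(psi, recovered)])  # [0.0, 0.0]
```

Reversibility is the key point of the analogy: nothing is erased by undisturbed quantum evolution, just as nothing is erased by a lossless codec.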

Following Michael Polanyi it seems appropriate to call the Hilbert space which serves as the domain of the Schrödinger equation its tacit dimension. We introduced this idea in chapter 2.1. Is there a real infinite dimensional physical object corresponding to the Hilbert space which acts as the tacit dimension of quantum mechanics?

Many physicists accept the "Copenhagen interpretation", which treats the complete set of basis states that contribute to a ket as real, and then postulates something like either the "collapse of the wave function" or the "many worlds model" to account for the fact that an observation reveals only one of the states that constitute the ket. My philosophical problem is that these states seem to have only an abstract mathematical or platonic existence, because there is no observable representative vehicle for them. Insofar as the basis states are all orthogonal, it does not seem possible that nature can represent such an infinity of real states (particles) in a system as small as an atom, or the even smaller nucleus of an atom. Copenhagen interpretation - Wikipedia, Wave function collapse - Wikipedia, Many-worlds interpretation - Wikipedia

This difficulty leads me to think that all the information in a quantum system is embedded in the real particles rather than in an abstract wave function or field.
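The behaviour just described, in which a ket superposes many basis states but each observation reveals only one of them, can be illustrated with a toy simulation. The two-state ket and its amplitudes below are purely illustrative; each simulated measurement picks a single basis state with probability given by the squared amplitude (the Born rule).

```python
import random

# An illustrative two-state ket: |psi> = 0.6|up> + 0.8|down>,
# normalized since 0.6**2 + 0.8**2 = 1.
amplitudes = {"up": 0.6, "down": 0.8}

def measure(ket):
    """Return one basis state, with probability |amplitude|**2."""
    r, total = random.random(), 0.0
    for state, amp in ket.items():
        total += abs(amp) ** 2
        if r < total:
            return state
    return state  # guard against floating-point shortfall

random.seed(0)  # fixed seed so the toy run is reproducible
counts = {"up": 0, "down": 0}
for _ in range(10000):
    counts[measure(amplitudes)] += 1

print(counts["up"] / 10000)    # close to 0.36
print(counts["down"] / 10000)  # close to 0.64
```

The superposition itself never appears in the output: only the statistics of many single outcomes reveal the amplitudes, which is precisely why the ontological status of the unobserved basis states is contested.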

It took quantum theory nearly 50 years to mature, from Planck's initial discovery of the quantum in 1900 to the clarification of the interface between quantum mechanics and special relativity in the late 1940s. First came the "old quantum mechanics", set in motion by Bohr's model of the hydrogen atom. Bohr's idea failed in the face of more detailed data, but in the 1920s two solutions emerged, matrix mechanics and wave mechanics. They turned out to be equivalent, and the situation was clarified when Dirac summarized the quantum story with his transformation theory and the Dirac equation. John von Neumann tidied up the mathematics by introducing abstract Hilbert space as the ideal domain of the theory. Bohr model - Wikipedia

The next step, the reconciliation of quantum mechanics and special relativity, initially proved difficult because it yielded infinite answers, but a system for removing the infinities known as renormalization was invented, so that quantum electrodynamic calculations now match experimental measurements to accuracies of parts per billion. Renormalization - Wikipedia

The appearance of infinities in attempts to interface special relativity with quantum theory and the need for renormalization point to another problem. It seems very unlikely that the infinities appearing in quantum field theory are really present in nature. Instead they suggest some problem in the mathematical approach to this work.

There seem to be two sources of this problem, both carried over from the use of continuous mathematics in classical physics. The first is that classical physics treats particles as structureless points, all of whose properties arise from their relationship to spacetime through such parameters as position, momentum, energy and time. The spacetime manifold is treated as continuous and differentiable, and a set of n particles, each with six degrees of freedom (three of position and three of momentum), is described by a configuration space of 3n dimensions and a phase space of 6n dimensions.

The second is that when point particles are endowed with field properties, the self energy of these particles blows up to infinity as their size approaches zero. If such a particle is supposed to have mass or energy, its energy density becomes infinite, another physically improbable feature. The classical model of the initial singularity suffers from this also: the energy of the universe is understood to be contained in a region of zero volume. We avoid this problem here by considering the initial singularity to be a quantum of action, not energy.

We might evade these distasteful consequences by considering the quantum of action to be a discrete real particle whose existence does not imply the existence of continuous space-time. Let us therefore assume that the quantum is the representation of an event or process which can be modelled as a computation. The simplest such computation is a null operation, formally identical to an error free communication channel. The most complex lie at the limit of computability defined by the Turing machine.

Heisenberg approached this problem in 1925 when he wrote: 'The present paper seeks to establish a basis for theoretical quantum mechanics founded exclusively upon relationships between quantities which in principle are observable.' The problem here, of course, is that we cannot observe the processes represented by the mathematical machinery of quantum mechanics, only those that are represented by measurable particles in classical space-time. Scientific theology, since it is founded on quantum theory, has a similar problem. We can observe the classical fixed points of god, but the underlying dynamics is hidden from us. Werner Heisenberg: Quantum-theoretical re-interpretation of kinematic and mechanical relations

This invisibility does not mean that the interactions of particles are not real. As explained in section 5.12, very simple systems may be invisible because they do not have the power to do their work and explain themselves simultaneously.

Hawking and Ellis, studying the large scale structure of the universe implied by Einstein's general theory of relativity, concluded that the universe may have started from a structureless initial singularity, which, like the Christian God, is absolutely simple. This provides us with an opportunity to identify the source of the world with the Christian god. Both god and the initial singularity share just three properties, they exist, they are absolutely simple and they are the source of the universe. Hawking & Ellis: The Large Scale Structure of Space-Time


6.2: Clues from the ancient god

The idea that information is represented physically implies that real particles are the natural vehicles of information. Our starting point is the initial singularity. The god of Genesis created the universe itself by saying a series of words: "let there be light". We may understand the traditional creation as an example of intelligent design. God had a plan in mind for the universe and realized this plan outside themself, as architects and builders do.

This raises a difficulty. It is a principle in classical theology that attributes which are accidental in the created world are substantial in god. Ideas that are accidental in human minds are understood to be real in god's mind. As we will see when we come to discuss the Trinity, the Father's image of themself is understood to be the real person of the Son. So we might think god's idea of the universe is already real within god, and cannot meaningfully be duplicated outside god. Here we take this idea seriously. The divine universe we propose grows within god, not outside it. Aquinas, Summa I, 15, 1: Are there ideas?

Although the theology proposed here is radically different from the Christian story, we can still learn a lot from the picture of god developed by Aquinas. By identifying the initial singularity implicit in general relativity with the completely simple traditional divinity of pure action, we can adapt many ideas from the old theology to the new.

Our starting point is identical to the classical god of Aquinas, absolutely simple. The end point is the god of imagination, magnificent, infinite, omniscient and omnipotent, the universe that we see. The path between these two images of god is mapped by the network model outlined in chapter 5.

We have two principal avenues to understanding the physical universe: gravitation and quantum theory. General relativity describes gravitation which controls the large scale structure of the universe and so far matches the observed data perfectly. Unfortunately it is incompatible with the current form of quantum theory. Although general relativity tells us what is happening, it does not tell us why. General relativity - Wikipedia

The microscopic local properties of quantum theory lie at the beginning of the chain of explanation that leads us to a comprehensive understanding of the universe as a whole. This idea is implicit in the network model, which is symmetrical with respect to complexity. The same is true of quantum theory. Its structure remains the same whether we are dealing with two states or an infinity.

This feature enables us to move up and down the scale of complexity carrying understanding obtained at one level to other levels. The overall metaphysical foundation for this idea I call cognitive cosmology: it makes the best sense to think of the universe as a mind. From this point of view our intelligence is nothing special. Each of us is a local version of the enormous intelligence that created us and our world.

Thomas Aquinas begins his Summa Theologiae with the claim that, starting from our observations of the world, we can prove the existence of god. Aquinas is not so much proving that god exists as that god is not the world. Given his Christian belief that god created the world, it must have been obvious to him that the creator existed. As far as I know, he never addressed the difficulty of the duplication of the world raised above. Thomas Aquinas, Summa, I, 2, 3: Does God exist

Aquinas's first way is almost identical to Aristotle's argument for the existence of a first unmoved mover. Motion, he says, is the transformation from potentiality to actuality. Aristotle holds, as an unproven axiom, that no potentiality can actualize itself; it must be moved by an actual mover. To prevent an infinite regress, which would imply that nothing can move, he must therefore postulate a being that is the source of its own action: pure action with no potential whatsoever, the realization of all possibility. Aristotle: Metaphysics XII, vii, 9: 1072 b 25 sqq

Aristotle and Aquinas both conclude from their arguments that god is pure actuality (Latin: actus purus; Greek: energeia or entelecheia). In more modern terms, we may say god is pure dynamics. In Christian terms, god is a living god. Both Aristotle and Aquinas define life as self motion, and Christianity accepts this idea, that god is the source of its own activity. In modern physics, potential and kinetic energy are both simply energy, freely interconvertible, as we see in a pendulum. Aristotle's axiom no longer holds.

The first thing Aquinas denies of god is complexity. Working from the idea that god is pure actuality, Aquinas concludes that god is absolutely simple, omnino simplex. Aquinas, Summa, I, 3, 7: Is God altogether simple?

This brings us up against a mystical conception of god which presents another obstacle to identifying god and the world. Having proved that god exists, Thomas then goes on to discuss the nature of god. Following ancient tradition, Aquinas believes that god is so far beyond human comprehension that we cannot say what god is, only what god is not. This ancient idea is generally known as the via negativa, the way of negation. The absolute simplicity of god appears to leave nothing for our minds to get a grip on. Apophatic theology - Wikipedia

A possible solution is provided by the mathematical discovery that logically consistent dynamic systems have fixed points. Few would doubt that god, whatever it is, is logically consistent. This belief is the foundation of the Christian idea that god, in their mysterious ways, always acts for the best, although this may not be immediately apparent to us. Power (philosophy) - Wikipedia

From a mathematical point of view, a dynamic system that maps onto itself can have fixed points. As a wheel goes round, for instance, the piece of the wheel that is at one point at one moment is to be found at another in the next, while remaining within the wheel. Fixed point theory predicts that there will be one point which does not move. In the case of a rotating, mathematically perfect wheel, this is the centre of the wheel.
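The rotating wheel makes a simple computational example. In this toy Python sketch the "wheel" is a rotation of the plane about a chosen centre (the coordinates and angle are illustrative): every point on the rim moves, while the centre maps to itself, the fixed point of the motion.

```python
import math

def rotate(point, centre, angle):
    """Rotate a 2D point about a given centre by angle radians."""
    x, y = point[0] - centre[0], point[1] - centre[1]
    c, s = math.cos(angle), math.sin(angle)
    return (centre[0] + c * x - s * y, centre[1] + s * x + c * y)

centre = (2.0, 3.0)   # the axle of the wheel
rim = (5.0, 3.0)      # a point on the rim

moved = rotate(rim, centre, math.pi / 2)
fixed = rotate(centre, centre, math.pi / 2)

print(tuple(round(v, 6) for v in moved))  # (2.0, 6.0): the rim point moves
print(tuple(round(v, 6) for v in fixed))  # (2.0, 3.0): the centre is fixed
```

However fast the wheel turns, the centre participates in the dynamics while remaining still, which is the sense in which stillness is here treated as part of motion rather than its opposite.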

On the assumption that god is all there is (which here we assume to be true by definition), all motion is within god, and so we can expect fixed points. The mathematical proofs of fixed point theorems are often non-constructive, that is, they follow the via negativa by showing that a logical contradiction would arise were the theorem false. Such proofs probably apply to god. Fixed point theorem - Wikipedia

This leads us to three conclusions. First, motion and stillness are not opposed, as the ancients supposed. The fixed points are just as much part of the motion as the moving points. The fixed points of the world, like ourselves, trees, atoms and stars, which we observe as components of a complex structure, are part of the divine dynamics.

Second, since the fixed points which we observe are divine, they are a revelation of god. Instead of having to rely on a few ancient texts and on institutions that claim to know the true meaning of these texts, everything we observe in the world and every one of our experiences is experience of god. This revelation serves as an enormous and invincible empirical foundation for scientific theology.

Third, we may understand a dynamical system in action to be simple insofar as it is seamless motion. Our own bodies comprise trillions upon trillions of tiny parts which we think nothing of unless they are damaged or in pain. Although the world appears to have many fixed points like trees, mountains and stars, all these structures are ultimately part of a dynamic whole, some moving faster than others. Although I might think of myself as a person or thing, I am better described as an event, lasting about a hundred years from my birth to my death. Brouwer fixed point theorem - Wikipedia

This theory of fixed points solves another old problem. It is commonly believed that the Christian god is omniscient, which in practical terms means that they have (or are) a database containing every bit of information about the universe from the moment it was created until it ends, if ever. A database, like any script, is a set of marks or fixed points. We noticed in section 5.4 that error free communication requires the use of discrete symbols. The fixed points serve as the discrete points which represent the mind of god. The traditional god is immutable and eternal, having but one form. The fixed points in the divine universe, on the other hand, are eternal as long as they last, but can be annihilated and new ones created. There is much more information in the life of a dynamic god than an immutable one.
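The claim that error free communication requires discrete symbols can be made concrete with a toy Python simulation (all values illustrative). A symbol drawn from a discrete alphabet can be "snapped" back to the nearest legal value after each noisy copy, so it never drifts; a continuous value copied with the same noise accumulates error without limit.

```python
import random

SYMBOLS = [0.0, 1.0]   # a discrete two-symbol alphabet

def noisy_copy(value, noise=0.1):
    """Copy a value with a small random transmission error."""
    return value + random.uniform(-noise, noise)

def snap(value):
    """Error correction: round to the nearest legal symbol."""
    return min(SYMBOLS, key=lambda s: abs(s - value))

random.seed(1)  # fixed seed so the toy run is reproducible
discrete, analogue = 1.0, 1.0
for _ in range(1000):
    discrete = snap(noisy_copy(discrete))   # corrected each generation
    analogue = noisy_copy(analogue)         # uncorrected: free to drift

print(discrete)          # still exactly 1.0 after 1000 copies
print(analogue != 1.0)   # True: the continuous value has wandered
```

The fixed points of the text play the role of the discrete symbols here: because they are separated, small perturbations can be recognized and undone, which is what makes an effectively eternal record possible.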

Galileo's proscription by the Church marked a turning point in the relationship of scientists to politicians. Galileo lost his battle with the hierarchy of the Roman Catholic Church, but science has come a long way since then and its benefits are indisputable. It is, in a divine universe, an element of theology, and the ancients were right when they felt that a good relationship with the divine environment is the key to prosperity. While many politicians find scientific reality distasteful and would like to muzzle it, many voters can see its enormous benefits and vote to fund the sciences.

Science, like justice, is a continual fight against vested political (including financial) interests. This is why dictators have a penchant for imprisoning and murdering scientists and judges first, as an example to the others. If they do not have the stomach for murder, secrecy and threats are pretty good substitutes. There is nothing worse for a promising career than being forced to become a whistleblower against the industry where one is seeking employment. Karim Shaheen: March for Turkey's jailed judges highlights purge on dissidents

The Catholic Church and its theologians have managed to defend the core of their theology from the inroads of science, so we are still living by theological ideas that are two thousand years old. One reason for this is that the Church has convinced many people that science is not even relevant to theology. They champion the idea of two truths, one arising from observation of the world, the other from special revelation by god. This is the message of Pope John Paul II's address to the Pontifical Academy of Sciences:

How do the conclusions reached by the various scientific disciplines coincide with those contained in the message of Revelation? And if, at first sight, there are apparent contradictions, in what direction do we look for their solution? We know, in fact, that truth cannot contradict truth. . . .

Consequently, theories of evolution which, in accordance with the philosophies inspiring them, consider the mind as emerging from the forces of living matter, or as a mere epiphenomenon of this matter, are incompatible with the truth about man. Nor are they able to ground the dignity of the person.

That is, the truth established by the ancient magisterium of the Church takes precedence over anything science might come up with. Of course, given the hypothesis of this book, there is no issue. The Universe is divine and humanity obviously participates in divinity and has evolved in the image of god. Science is part of true theology. John Paul II: Truth cannot contradict truth, Magisterium - Wikipedia

The network model discussed in the previous chapter is an abstract formal structure. It is in no way constrained by the fact that in reality all information is encoded physically. So we imagine the countably infinite sets of natural numbers and Turing machines, and use them to generate the uncountably infinite transfinite hierarchy of ever more complex and numerous symbols.

The term infinity does not imply the existence of actually infinite quantities. It serves as a catchall upper bound which tells us that such processes as adding one to an existing number can continue forever without leading to contradiction.

The next step is to apply the network model to the world of our experience which seems, at first sight, very much smaller than a transfinite network. This is not the case. Mathematics creates formal frames of reference which we use to talk about the world, rather like Descartes' coordinate geometry. Nature does not use frames of reference. Like god, it just is. Every process is local, controlled by contact with its environment, not by reference to some abstract plan which some, like Isaac Newton, believe exists in the mind of god. A big problem for Einstein in formulating the general theory of relativity was to understand how to make a picture of the world that was independent of any system of coordinates.

An important feature of the network model is that it is in effect its own coordinate system. Every point in it has its own address built into it, and all its interactions with other points are conducted in terms of contacts at their intrinsic addresses, rather than their positions on some map. So as we use our devices to communicate with other people around the world, our principal interest is in our interaction with these people, with no particular concern for their position on the globe.
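The idea of a network addressed by intrinsic identity rather than external position can be sketched in a few lines of Python. The classes and addresses below are hypothetical illustrations: the network's dictionary of addresses is its only "map", and delivery never consults any spatial coordinate.

```python
class Node:
    """A network source identified only by its intrinsic address."""
    def __init__(self, address):
        self.address = address
        self.inbox = []

class Network:
    """A set of nodes whose address table is its own coordinate system."""
    def __init__(self):
        self.nodes = {}  # address -> node: no external frame of reference

    def join(self, address):
        self.nodes[address] = Node(address)

    def send(self, sender, receiver, message):
        # Delivery depends only on the intrinsic address of the receiver,
        # not on where either party happens to be located.
        self.nodes[receiver].inbox.append((sender, message))

net = Network()
for addr in ("alice", "bob"):
    net.join(addr)
net.send("alice", "bob", "hello")
print(net.nodes["bob"].inbox)  # [('alice', 'hello')]
```

This mirrors everyday telecommunication: a phone number or email address identifies a party, and the conversation proceeds identically wherever on the globe that party happens to be.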


6.3: The mathematical community

Because the network model is symmetrical with respect to complexity we can use it to move up and down from the traditional featureless god to the enormously complex world of daily experience. From an abstract point of view a community is a communication network. The human world is enmeshed in a web of networks beginning with our relationships to ourselves and expanding through individual personal relationships to the huge array of long distance networks established by travel, postal services, telecommunications and the endless audiovisual channels offered by the internet.

We are setting out to match a mathematical model of god to the world of our experience to see if they correspond. My draft of such a model is an honours thesis written in 2019. Jeffrey Nicholls: Prolegomenon to Scientific Theology

The mathematical relationships from which such models are constructed are a product of the mathematical community. We can read the recorded history of this community stretching back at least 5000 years and see its traces in older artefacts. George Gheverghese Joseph: The Crest of the Peacock: Non-European Roots of Mathematics

Mathematics is a cumulative endeavour, each new discovery or invention building on what is already known. Its formal structures, if correctly proven, do not decay and need no maintenance, apart from copying texts that deteriorate and adding new inventions which expand and reinforce the old.

The mathematical community is a layer in the human network whose physical representation comprises all the people involved. In this network, individual mathematicians are the sources and the formal and informal communications between them are the links which bind them into their community.

The formal representations of mathematical proofs and other statements exist outside time. Encoded in an eternal medium, or copied before each physical representation fails, they are effectively eternal. The Platonic model of god that we have inherited from antiquity also exists outside time, and is considered to be eternal. Paul Helm: Eternity (Stanford Encyclopedia of Philosophy)

Throughout recorded history, mathematics has been an important tool for understanding our world, beginning with arithmetic, accounting for discrete objects like coins and sheep, and extending to the measurement of continuous quantities like land and fabric, which inspired geometry. The relationship between physics and mathematics was sealed in Galileo's time when he claimed that mathematics was the language of the universe. Newton took a great step forward when he invented calculus to describe the solar system. Gauss and Riemann extended calculus to define differentiable manifolds, which became the mathematical foundation for Einstein's general theory of relativity and the continuous groups of modern fundamental physics. This theory remains our picture of the large scale structure of the universe. Lie Group - Wikipedia

Newtonian physics ruled the world until the middle of the nineteenth century when the relationship between electricity and magnetism opened up a whole new field of study which lay outside Newtonian dynamics. Maxwell showed, however, that it is well within the range of calculus. Maxwell's differential equations opened up a new world of physics by explaining that light is a form of electromagnetic radiation. Around the same time spectroscopists were discovering the close relationships between light and matter and laying the foundations for quantum theory, which has placed new demands on mathematics and provided a rich field of new inspiration.

Quantum theory is now more than a century old and continues to raise more mathematical questions than it answers. The Clay Mathematics Institute is currently offering a million dollar prize to anybody who can make clear mathematical sense of quantum field theory. While the theory has delivered spectacular progress in our understanding of the foundations of the world, it still relies on some magical thinking to get results, because we cannot really see what is happening down there. A century of ever more powerful particle accelerators and astronomical instruments has delivered mountains of data, but theory is struggling to catch up. In particular, there is a big mathematical gap between quantum mechanics and general relativity. Carlson, Jaffe & Wiles: The Millennium Prize Problems, problem 7.

Here I wish to draw an analogy between the mathematical community and the current state of quantum theory, based on the idea that the network model has a foot in both camps. The difficulty we face is outlined in 6.1 above. I am seeking insight by looking at the mathematical community through the lens of quantum field theory. The people in the community play the roles of structural particles, the fermions. The space-time web of communications between the players runs on bosons, the messengers. The field of fermions and bosons binds the community into a functioning whole. By being part of our own communities, we may get some feeling for how the system works.

The mathematical community is a dynamic entity, existing within time, whose communications are fixed points like theorems and discussions of theorems. We imagine that all human communities behave in an analogous manner. While mathematical theorems, if correctly proven, are considered to remain true forever, the communications we exchange in everyday life may be said to be fixed points for as long as they last, momentarily eternal. Our dynamic universe seems to be much bigger than the ancient immutable god since it comprises a long evolving sequence of eternal spacelike slices, each as big as the universe so long as it lasts.

The Platonic view of mathematics is that its theorems have some sort of independent existence outside humanity so that a mathematician producing a new proof is not so much creating something new as discovering something previously unknown. The situation is analogous to an explorer coming across a landscape that has never been seen before by human eyes, even though it may have existed in place for thousands or millions of years.

In the print oriented human world one must publish a discovery in some permanent and widely respected form to gain credit. There is work for historians of science to decide priority if an idea is published by many people at about the same time. No one who had the idea and did nothing about it enters into the discussion. A vast treasury of mathematics may exist in the Platonic heaven, but we know nothing of it until it is published. Events may exist in the mind of god, but we cannot know about them until they happen. David J. Gross: Nobel lecture: The Discovery of Asymptotic Freedom and the Emergence of QCD

There is considerable difficulty, as we discussed above, reconciling the eternity of god with the life of god, since life to us means motion. Here in the divine Universe, we understand all observables to be fixed points in the divine dynamics. We cannot see the dynamics, only the fixed points. In the spirit of science, therefore, we can say nothing about the invisible dynamics except what we can infer from what we observe.

All the communications within the community appear as physical representations of information, in speech, writing, images and so on. But there is also much more information represented in the minds of mathematicians, which we know to be physically represented by their bodies, particularly in their central nervous systems, and these dynamical states are open to being represented in suitably large Hilbert spaces.

Quantum field theory is built around the idea of gauge theory which assumes that in the world of fields there is a lot going on that we cannot see. All that we have to go on are the things that we can see, so that we assume that many details of what is happening in the invisible world must cancel out or be irrelevant so as to leave us with what we do see. This idea is called gauge symmetry. A mathematically perfect wheel is symmetrical: we cannot tell if it is spinning or not until we break its symmetry by putting a mark on it. In the mathematical community we may say that the observable output is the published theorems. All the flow of education, exploration and discussion that leads to the theorems is from this point of view invisible, the symmetry which is eventually marked by the emergence of a theorem. We see a similar phenomenon in our parliaments. Their practical output is legislation. Behind the legislation is an invisible mountain of talk and inquiry that feeds the construction of legislation.

It is easy to see where all the talk comes from in our communities. Where does the analogous invisible background come from in physics? It arises from our need for frames of reference to measure and construct things. A builder is presented with a set of plans, drawn by an architect, and a piece of land. The first step is to use the dimensions on the plan to drive in some pegs to outline the foundations. This is pretty easy. The land does not move and the measurements can be laid out using a tape and a bit of geometry. An astronomer trying to plot the motions of the solar system has to take into account all the motions of the planets and moons in three-dimensional space. Not so easy, but after centuries of work astronomers have devised a number of reference systems. The equatorial system, for instance, is fixed by the direction of the Earth's poles and the intersection of a projection of the equator with the Earth's orbit at the March equinox. Different systems can be interchanged by appropriate geometric calculations. These coordinates, measured to high precision, are used for astronomical and cosmological work. Celestial coordinate system - Wikipedia

The establishment of coordinate systems for quantum mechanics is more difficult. It goes beyond geometry into establishing relationships between the infinite number of states occupied by an enormous variety of microscopic particles. These particles range in size from large molecules like nucleic acids and proteins to fundamental particles like electrons, photons, quarks and gluons. These states are represented by state vectors, |φ> in Hilbert space, a linear complex vector space with any number of dimensions. John von Neumann The Mathematical Foundations of Quantum Mechanics

The most interesting feature of quantum mechanics is superposition. Since quantum theory, like the theory of computer networks, is indifferent to the complexity of the system being studied, superposition can be illustrated using a two-dimensional Hilbert space with basis vectors |0> and |1>. We can form a state vector by adding them together to form a superposition called a |qubit>: |qubit> = a|0> + b|1>, where a and b are complex numbers such that |a|² + |b|² = 1. This superposition is not directly observable. When |qubit> is observed, we see the value corresponding to |0> with probability |a|², which is a real number, and the value corresponding to |1> with probability |b|². Why this works is known as the quantum measurement problem. It appears that a lot of the information encoded in |qubit> is lost in the measurement process.
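The arithmetic of superposition and measurement can be sketched in a few lines of Python. The amplitudes a = 3/5 and b = 4i/5 are arbitrary illustrative choices, not values drawn from the text:

```python
import random

# A minimal sketch of a qubit as a superposition a|0> + b|1>,
# with complex amplitudes satisfying |a|^2 + |b|^2 = 1.
a = complex(3, 0) / 5   # amplitude of |0>
b = complex(0, 4) / 5   # amplitude of |1>

p0 = abs(a) ** 2        # probability of observing |0>
p1 = abs(b) ** 2        # probability of observing |1>
assert abs(p0 + p1 - 1) < 1e-12  # normalization

def measure():
    """Simulate one observation: the superposition collapses to 0 or 1."""
    return 0 if random.random() < p0 else 1

# Repeated measurements of identically prepared qubits approach the
# Born-rule frequencies, but each single outcome is irreducibly random,
# and the complex phases of a and b are lost entirely.
counts = [measure() for _ in range(100_000)].count(0)
```

The sketch makes the information loss visible: two complex numbers go in, but only a single frequency distribution over two outcomes comes out.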

It is difficult to decide if this lost information really exists in the first place. Given that we can (theoretically) make an infinity of different superpositions in a two-dimensional Hilbert space, we must wonder where all this information is stored. Do the fields proposed by field theory really exist, or is all the information stored in the particles?

The "mathematical field" in the mathematical community exists in the minds of the mathematicians and their communications. How do we model this field as a local gauge symmetry? One point to note is that given all the human languages that may be involved each core mathematical idea may have a large space of expressions which can be translated or transformed into one another. Here I will assume, by analogy with the mathematical community, that the information attributed to fields is stored in or represented by particles. I presume that an electron, like a mathematician, has a personality that guides its interaction with other particles.

Back to top

6.4: Divinity to trinity

We now turn to our deepest physical foundations, the lower physical layers in the universal network. We have previously noticed that we can imagine two structureless sources of the Universe: the traditional Christian God and the initial singularity predicted by general relativity. Gravitational singularity - Wikipedia

The line of thought developed in this essay started in 1965 when I read Bernard Lonergan's claim that the existence of 'empirical residue' shows that the Universe is not god. This did not seem right to me. I felt that every datum, that is every event, has a history stretching back to the beginning. God, after all, is not an event, it just is. Nevertheless we understand fixed point theory to predict that a god will embody fixed points that are related to one another and given meaning by the underlying dynamics. Lonergan: Insight: A Study of Human Understanding

In chapter 1 I mentioned that way back in my monastic days I guessed that the theory of the Trinity might provide a means to link the absolute simplicity of the traditional god to the unlimited complexity of the observable world. Fifty years later this still looks good. Now I see the Christian doctrine of the Trinity in the light of fixed point theory. The first steps in the differentiation of god were already taken for us long ago in the theory of the Trinity developed by Augustine and Aquinas and studied in modern times by Bernard Lonergan. They now merely require translation into the modern idiom and extension from a duality of two personae (Father and Son) to the third person of the Trinity (the Holy Spirit) and beyond to a transfinite number of persons (sources). Trinity - Wikipedia, Lonergan (2007): The Triune God: Systematics

The ancient theologians needed to find a way of reconciling the unity of God with the multiplicity of personalities suggested in the Bible. Aquinas, following Augustine, dealt with this problem using a psychological model drawn from ancient Greek psychology. This idea first appears in John's suggestion that the second person is the Father's Image of Himself, not an abstract image as might occur in a human mind, but a concrete image, as real and divine as the Father. The third person of the Trinity, the Holy Spirit, is understood as the realization of the love between Father and Son. The constitution of God was mapped onto ancient ideas of the human spiritual constitution, consistent with the idea that we are conscious beings, created in the image of god, capable of reflecting on ourselves and loving our reflection. Aquinas, Summa, I, 27, 1: Is there procession in God?, Aquinas, Summa, I, 34, 2: Is "Word" the Son's proper name?

Many churches consider the trinity to be a revealed truth, not open to either explanation or contradiction. What is important, from our point of view, is that the traditional god can be many. The fact that the observed Universe is extremely complex does not, therefore, immediately rule out the claim that the Universe is divine.

The Trinity is an example of the basic unit from which networks are constructed: two sources and a link between them. In quantum mechanical terms, two fermions and a boson. Networks expand by copying themselves, adding more and more sources and linking them together. The Australian project to build a National Broadband Network is an example of this process. We see it also everywhere in life, organisms growing by cells duplicating themselves and the daughters differentiating and remaining in close communication with one another.

We assume that the initial singularity shares the properties of the traditional god and is capable of reproduction and differentiation. If it was not, we would not be here. Although the psychological model of the Trinity provides a credible model as an aid to belief, there remains a problem. While all three persons are understood to be identically god they are also held to be really distinct. The contradiction involved in this picture is represented by the Scutum Trinitatis (Shield of the Trinity), a graphic representation of the Trinity. To avoid this contradiction, we introduce space. Logically, we define space as a situation where distinct objects, p and not-p, can exist simultaneously. Traditional theology claims that god is not a body and so does not occupy space. If we are to have many identical things, however, it seems that we need space to distinguish them. In 6.7 we add more detail to this idea. Fundamental particles are divided into two classes, bosons that are attracted to existence in the same place, and fermions that resist existing in the same place. Shield of the Trinity - Wikipedia

Back to top

6.5: Physical evolution I: Action to energy

We imagine that the fixed points emerge in the divine dynamics in an orderly sequence, beginning with very simple systems which gradually became more complex by a process of differentiation and evolution. This fits our observations of the history of the Universe. We begin, like Aristotle and Aquinas, with an absolutely simple god of pure action. In Christianity, the first bifurcation of the divinity is into Father and Son. Here we guess that the analogous bifurcation was the origin of time and energy.

This identification is suggested by classical dimensional analysis which sees physical parameters in terms of the dimensions mass, length and time, M, L and T. The first compound we construct is velocity, distance divided by time, which we write LT⁻¹. In classical physics, kinetic energy is given by the formula KE = ½mv². From this we conclude that the dimension of energy is ML²T⁻². Dimensional analysis - Wikipedia

Action, in modern physics, is the time integral of energy which, at its simplest, means energy multiplied by time. The dimension of action is therefore ML²T⁻² × T = ML²T⁻¹. We might see time and energy as a duality created by the emergence of these two observable fixed points from pure action. One of the fundamental equations of quantum mechanics is E = hf, where the frequency f is the inverse of time. A third definition of energy is work, the product of force and distance. Force is mass × acceleration, so once again the dimension of energy becomes MLT⁻² × L = ML²T⁻².
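This dimensional bookkeeping can be checked mechanically. In the Python sketch below a dimension is just a triple of exponents of M, L and T; the Dim class is a hypothetical helper invented for this illustration:

```python
from dataclasses import dataclass

# A dimension is a triple of exponents of mass, length and time.
# Multiplying quantities adds exponents; dividing subtracts them.
@dataclass(frozen=True)
class Dim:
    M: int
    L: int
    T: int
    def __mul__(self, other):
        return Dim(self.M + other.M, self.L + other.L, self.T + other.T)
    def __truediv__(self, other):
        return Dim(self.M - other.M, self.L - other.L, self.T - other.T)

mass, length, time = Dim(1, 0, 0), Dim(0, 1, 0), Dim(0, 0, 1)

velocity = length / time                          # LT^-1
energy = mass * velocity * velocity               # ML^2 T^-2, from KE = mv^2/2
action = energy * time                            # ML^2 T^-1, energy x time
work = (mass * length / (time * time)) * length   # force x distance
```

Both routes to energy, kinetic energy and work, land on the same dimension ML²T⁻², and action comes out one power of T short of energy × time squared, exactly as in the text.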

In quantum theory time is measured as a count of actions, like the ticks of a clock. We see action as ubiquitous, primordial and undifferentiated, an atomic unit which appears in the observed world as the quantum of action. In the model proposed here, every action is a representation of the initial singularity, so we take the initial singularity to be the primordial act, consistent with the idea proposed by Aristotle and Aquinas that god is pure act, actus purus.

Logically, an action is simply something that changes some p into some not-p. When I move my hand it goes from here to there, which is not-here. An action has no intrinsic physical size, so there is no problem identifying it as both divine and the smallest entity in the universe, measured by the quantum of action. This is consistent with Aquinas's idea that god is ubiquitous in the universe. Aristotle's definition of time, 'the number of motion according to before and after', seems consistent with this idea. So we take the first step in the emergence of the current universe from the initial singularity to be the bifurcation of pure action into energy and time. Another way of saying this is that the symmetry we call action has broken into energy and time. In terms of our computer model, we now have the foundation of a real computer, a clock that synchronizes all the logical activity in the machine. Aquinas, Summa, I, 8, 1: Is God in all things?, Aristotle: time, Planck constant - Wikipedia

Turing's mathematical computer, like all of mathematics, is formal and outside time. Turing's idea has evolved through many physical implementations to become the computers we now use. Apart from synchronization, the other role of the clock in a computer is to hide the physical dynamics behind the logical process. Given a machine in a physically stationary state, the clock emits a signal which sets everything in motion, carrying the computation one step forward. After enough time has elapsed for the physical processes to reach equilibrium, the clock emits a second signal which freezes a snapshot of the new state. Turing noted that a human computer goes through a similar routine. Given a written copy of a certain stage in a computation, the computer then works to carry the computation one step forward and writes down the result. They may then stop work and hand their result to the next shift, which has all the information necessary to carry the computation forward another step.
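This clocked routine can be caricatured in a few lines of Python. The step function and the running sum it computes are purely illustrative; the point is that each tick freezes a complete snapshot which is all the next "shift" needs:

```python
# A toy sketch of clocked computation: the clock hides the dynamics
# between ticks, exposing only a snapshot of each new state.
def step(state):
    """One logical step: carry a running sum one stage forward."""
    total, n = state
    return (total + n, n + 1)

state = (0, 1)          # the written snapshot handed to the first shift
for tick in range(10):  # each clock tick freezes the next snapshot
    state = step(state)

# state[0] is now 1 + 2 + ... + 10 = 55; whatever physical process
# produced each snapshot is invisible to the shift that receives it.
```

Only the snapshots are observable; the "physical" work inside step is hidden, which is the analogy with gauge symmetry drawn in 6.3.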

How do particles evolve and become more complex? Evolution requires memory to carry information from generation to generation. In living creatures this memory is provided by the nucleic acids DNA and RNA. This system dates from close to the origin of life. Genes for some of the fundamental metabolic processes, like the citric acid cycle that plays a key role in human energy metabolism, are found in the Archaea which evolved about three billion years ago. These genes have been reproduced through trillions of generations to become part of our bodies. Citric acid cycle - Wikipedia, Archaea - Wikipedia

How did this structure come about? There is a lot of speculation, but no certainty because it happened about three billion years ago and we have almost no evidence to go on except a few fossils and laboratory and computer simulations. Given the existence of gene based life, however, the process of evolution from single-celled organisms to the present is relatively transparent since early bacteria are still with us and open to study. Earliest known life forms - Wikipedia

Traditional western theology imagines that the universe was intelligently designed and powerfully created by an omniscient and omnipotent god. Here I identify the initial singularity as the starting point of the universe and identify it with the god of Aquinas because both share the attributes of existence, absolute simplicity and creation of the world. The absolute simplicity of the creator presents a cybernetic problem, insofar as the principle of requisite variety prohibits a simple system from controlling a complex one (5.5). On the other hand, it seems reasonable to attribute omnipotence to both the traditional god and the initial singularity since the only constraint on each of them is internal consistency. Neither is subject to external constraint.

The scientific approach to physics and cosmology requires that we take the world as we find it and attempt to understand the processes that make it behave as it does. This project made a great leap forward in the twentieth century with the discovery of quantum mechanics and relativity, but has left us with many puzzles, some outlined in sections 6.1 - 6.3. This approach, common to all science, is an attempt to understand the past from evidence obtained in the present. Implicit in this approach is the scientific belief that our observations of the world and the mechanisms that lie behind these observations are consistent. If we do find inconsistencies, we may take this as evidence that we are on the wrong track and need to recheck our models and observations. Apparent inconsistency is the driving force of scientific progress.

Another approach that supports the idea of understanding the past through the empirical present is the anthropic cosmological principle. The idea here is that the Universe was deliberately constructed by a designing creator to allow for our evolution. This conclusion arises because some see evolutionary bottlenecks which require precise tuning of various physical parameters to arrive at conditions conducive to life. One of these concerns the creation of carbon itself. We understand that heavier elements are synthesized by fusion of lighter ones. It turns out that there is no way to make carbon except by the fusion of three helium nuclei. This seems at first sight a very improbable event, which is nevertheless made possible by a couple of coincidences which may have been designed in by a creator. Anthropic principle - Wikipedia, Barrow & Tipler: The Anthropic Cosmological Principle

The first of these is a resonance of beryllium-8 which increases the probability of fusion of two helium-4 nuclei. The second is the existence, predicted by Hoyle, of an excited state of carbon-12 which encourages the fusion of beryllium-8 and helium-4. The anthropic argument suggests that these resonances might have been designed in to the Universe to favour the evolution of carbon and ultimately the evolution of carbon based life forms. Triple-alpha process - Wikipedia

An alternative to working from the present, which includes our existence, to the past is to work from the past to the present. This approach is made possible by the fact, derived from general relativity, that the initial singularity has no structure and presumably zero entropy. The problem raised here is the same as we faced when dealing with the absolutely simple classical god, the principle of requisite variety. The initial singularity with entropy zero has no a priori power to constrain the universe. This is not only a difficulty, it is also an advantage. Since the initial singularity has no power to constrain the universe, we would expect the universe to span the fullness of possibility. This universe should therefore have the same power as the omnipotent classical god, which is also limited only by consistency. It may also be taken to mean that the initial singularity is a necessary being in that its essence is identical to its existence, its essence is to exist. Aquinas, Summa: I, 25, 3: Is God omnipotent?, Aquinas, Summa, I, 3, 4: Are essence and existence the same in God?

I am inclined to believe that evolution began at the beginning. We are proposing to describe the structure of the universe using the computer network described in chapter 5. The simplest operation a computer can perform is to do nothing. As long as the initial singularity does nothing, we can think of it as eternal. The first meaningful operation is the logical not, written not or ¬. In the world of binary logic, ¬¬p = p.

If we think of a wave as the sequence up-down-up-down . . . and understand down = ¬ up, we may see a wave as a sequence of not operations, and interpret a quantum of action executing such a sequence as a particle emitting energy. At this point we imagine that the duration of each cycle is random, but that the process is nevertheless cyclic. We propose cyclic reproductive behaviour as a criterion for the permanence of structures and so for the group like behaviour of much of what we observe in nature. The emergence of energy is the first step in the evolution of the universe. At this point we have no space but two time division multiplexed states, which we have called up and down, more abstractly p and not-p, more concretely potential and kinetic energy.
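The cyclic behaviour of repeated negation can be shown directly. The names negate and history below are illustrative only; the point is that not applied twice restores the original state, so iterated negation is a square wave:

```python
# A sketch of the binary not as a primitive oscillation: repeatedly
# negating a state yields the cycle up, down, up, down, ...
def negate(p):
    return not p

state = True            # "up", or p
history = []
for _ in range(8):
    history.append(state)
    state = negate(state)

# Two negations return the original state (not-not-p = p), so the
# sequence is cyclic with period 2, like the simplest possible wave.
```

The two time-division multiplexed states of the text, up and down, are here just True and False alternating under not.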

Back to top

6.6: Physical evolution 2: energy to entropy

In the absence of an omniscient divinity to create the details of the world, we need another mechanism. Broadly, we understand the initial singularity to be a source of random action and propose that some of these sequences of action become stable structures by establishing recursive closure so that they are able to maintain their own existence. We see this already in the generation of energy by the logical operation not, since in the binary system, not-not-p brings us back to p. In the vastly more complex context of life, species maintain their existence by continuously reproducing new individuals so that death is overcome by birth in a cyclic process closely analogous to a wave. We can add detail to this idea using the network model.

The conservation of energy is now recognised as a fundamental symmetry of the physical world. An important feature of communication networks is their layered structure. Each layer provides services to the layer above it so that it is in the interests of the upper layers to maintain the lower layers, since the higher layers cannot exist without their support. Peers at any level in a network communicate by sending a signal down through the network layers to the hardware layer which carries a physical representation of the information between them. The hardware signal is then translated up through the recipient network into a form intelligible to the recipient peer. All communication must go through the hardware. We understand the initial singularity to be the ultimate hardware of the universe. All users of the universal network therefore share the energy of the universe, which we may understand to account for its conservation. Since the creation of energy is a random event shared by all agents in the universe the fact of conservation of energy does not imply any absolute value of the energy of the universe, opening the way for the notion that the net energy of the universe may be zero. Conservation of energy - Wikipedia, Zero-energy universe - Wikipedia

Symmetries in general are features of lower layers of the universal network which are broken by the higher layers. They are the unchanging features of the moving world first imagined by Parmenides. Logically, we may think of a symmetry as an algorithm which remains constant as it is instantiated for different tasks by its users.

The conservation of energy is now known as the first law of thermodynamics. The second law of thermodynamics tells us that the universe has a general tendency to increase its entropy, that is to increase its count of distinct states. More simply, it is creative and our principal task here is to understand how the creator manages to create itself.

Energy is easy to understand since it is simply the time rate of action expressed by the fundamental equation of mechanics E = hf. Entropy is rather more subtle but in a sense simpler, since it is merely a count, without any physical dimension. It is a dimensionless measure of creation. The physical concept of entropy entered the world with the invention of steam engines. The French physicist Sadi Carnot first hinted at entropy in 1824 in his Reflections on the Motive Power of Fire. He understood that heat is a form of energy, and that the extraction of mechanical energy requires the passage of heat from a hot source to a cold sink. He invented a reversible cycle, the Carnot cycle, to achieve this and was able to derive a formula which predicted the maximum proportion of heat energy that could be extracted as mechanical energy by running the cycle between two temperatures T1 (high) and T2 (low). This proportion, the efficiency η of the ideal heat engine, is given by (T1 - T2) / T1. The Carnot cycle is reversible, so it can also model refrigeration, using mechanical energy to pump heat from a cold source to a hot sink. Reflections on the Motive Power of Fire - Wikipedia
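Carnot's formula is easy to evaluate; the reservoir temperatures below are arbitrary illustrative values in kelvin, not figures from the text:

```python
# Carnot efficiency of an ideal heat engine running between a hot
# reservoir at T1 and a cold reservoir at T2 (absolute temperatures).
def carnot_efficiency(t_hot, t_cold):
    if not (0 < t_cold < t_hot):
        raise ValueError("need 0 < T2 < T1 in kelvin")
    return (t_hot - t_cold) / t_hot

# A boiler at 400 K exhausting to a 300 K sink can convert at most
# a quarter of the heat flow into mechanical work.
eta = carnot_efficiency(400.0, 300.0)
```

The formula also shows why early engineers saw entropy as the enemy of efficiency: the smaller the temperature drop, the less of the heat flow is available as work.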

Further developments of Carnot's idea by Emile Clapeyron and Rudolf Clausius crystallized the notion of entropy. Entropy is the quantity that is conserved in the Carnot cycle: any reversible process must conserve entropy. The second law of thermodynamics states that entropy never decreases. Both the first and second laws apply to closed systems, which we take the Universe to be.

In classical thermodynamics, the entropy of a given quantity of heat varies inversely with its temperature: the higher the temperature of a given quantity of heat, the lower its entropy. The assumption in the big bang model that the Universe began at a very high temperature suggests that its entropy was very low. The consequent fall in temperature at constant energy is equivalent to an increase in entropy, consistent with the second law. One of the difficulties with the usual understanding of the initial singularity and the big bang is that the universe started off as a point of infinite energy and temperature, which makes little sense physically.

The entropy concept gained a second life with the development of the mathematical theory of communication. As we have seen (5.4), Shannon defined the entropy of a communication source as a function of the number and frequency of the letters of the source alphabet. Here entropy is interpreted as a measure of uncertainty. Information, which reduces uncertainty, is then measured as the reduction in uncertainty caused by the receipt of the information. The unit of uncertainty, the bit, is measured by the choice between yes and no. A game like twenty questions, which involves 20 yes/no answers, thus provides 20 bits of information, sufficient to decide between 2²⁰ = 1 048 576 possibilities.
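The twenty questions arithmetic can be checked with Shannon's standard formula H = −Σ p log₂ p, assumed here in its usual form:

```python
import math

# Shannon entropy of a source whose letters occur with the given
# probabilities: H = -sum(p * log2(p)), measured in bits per letter.
def entropy_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# One fair yes/no answer carries exactly one bit ...
one_question = entropy_bits([0.5, 0.5])
# ... so twenty answers can distinguish 2**20 equiprobable possibilities,
# a source whose entropy is exactly 20 bits.
possibilities = 2 ** 20
twenty_bits = entropy_bits([1 / possibilities] * possibilities)
```

The calculation confirms that 20 binary answers suffice to single out one possibility among 1 048 576.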

The mathematical measure of entropy in communication theory is consistent with the thermodynamic measure. Carnot and his contemporaries did not know that the water and steam in their engines comprised vast numbers of molecules. Later, Ludwig Boltzmann began to study thermodynamics from a molecular perspective and realized that the entropy of a substance is a function of the number of its internal states. He found the relation S = k log W, where S is entropy, k is the Boltzmann constant which relates the count of microstates to metric units, and W is the number of internal states or 'complexions' of the system measured. This shows that reversibility is a consequence of the cybernetic principle of requisite variety: number of states in = number of states out and vice versa. Boltzmann's entropy formula - Wikipedia
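Boltzmann's formula can be illustrated with a short calculation. The microstate counts below are arbitrary; the value of k is the exact figure fixed by the 2019 SI definition:

```python
import math

# Boltzmann's relation S = k log W connects thermodynamic entropy to
# the number W of microscopic states ('complexions') of a system.
K_BOLTZMANN = 1.380649e-23   # J/K, exact since the 2019 SI redefinition

def boltzmann_entropy(w):
    return K_BOLTZMANN * math.log(w)

# Doubling the number of available microstates adds k*ln(2) to S:
# the thermodynamic price of one extra bit of uncertainty, which is
# how the thermodynamic and communication measures line up.
delta_s = boltzmann_entropy(2 * 10**6) - boltzmann_entropy(10**6)
```

A system with a single microstate (W = 1) has zero entropy, matching the text's picture of a structureless initial singularity.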

Entropy has had a bad press over the years since increase in entropy became associated with an increase in chaos. Ideal mechanical energy has zero entropy, so designers of heat engines saw entropy as opposed to thermal efficiency. Entropy as a measure of information, on the other hand, lies at the foundation of cybernetics and information theory.

We have already noticed that the cybernetic principle of requisite variety tells us that a complex system can only be controlled by a system of equal or greater entropy or complexity (5.5). Gregory Chaitin has shown that this principle is a consequence of Gödel’s incompleteness theorem:

Gödel's theorem may be demonstrated using arguments having an information-theoretic flavour. In such an approach it is possible to argue that if a theorem contains more information than a given set of axioms, then it is impossible for the theorem to be derived from the axioms. In contrast with the traditional proof based on the paradox of the liar, this new viewpoint suggests that the incompleteness phenomenon discovered by Gödel is natural and widespread rather than pathological and unusual. Gregory J. Chaitin: Gödel's Theorem and Information

Traditional theology maintains that the combination of the divine omniscience and omnipotence gives God the ability to completely know and control every detail of the future. Requisite variety invalidates this claim. Insofar as God is understood to be absolutely simple, its variety is zero, and so its powers of knowledge and control are absent. This raises the problem at the heart of this essay: How could the universe be the work of a divine intelligent designer when the divinity is so simple that it cannot control anything? We have already noted the first step to the answer: random uncontrolled processes can generate new states, and so may be seen as a source of entropy. At this point, however, we may imagine new states to be annihilated as fast as they are created so that although energy may be conserved, entropy remains ephemeral.

Back to top

6.7: Quantum theory and the creation of space-time

We are trying to put together a universe that looks like a gigantic computer network, starting from the basic functions of Boolean algebra, not and and. The foundation of any computer system is its operating system. The basic roles of the operating system are to manage communication and memory. Here we set out to conceive space-time as the operating system of the universe, the representative domain of the transfinite computer network.

Much thought in physics has gone into two questions: how do we quantize gravitation; and what really happens when we make a quantum measurement. Both questions relate to the interface between the quantum and classical worlds and both may be the wrong questions. In Einstein's Mistakes, Steven Weinberg writes:

'The Copenhagen interpretation describes what happens when an observer makes a measurement but the observer and the act of measurement are treated classically. This is surely wrong: Physicists and the apparatus must be governed by the same quantum mechanical rules that govern everything else in the universe. But these rules are expressed in terms of a wave function (or more precisely a state vector) that evolves in a perfectly deterministic way. So where do the probabilistic rules of the Copenhagen interpretation come from? . . . It is enough to say that neither Bohr nor Einstein had focussed on the real problem of quantum mechanics. The Copenhagen rules clearly work, so they must be accepted. But this leaves the task of explaining them by applying the deterministic equation for the evolution of the wave function, the Schrödinger equation, to observers and their apparatus.' Steven Weinberg: Einstein's Mistakes

Perhaps the distinction between the quantum and classical worlds is a furphy?

Going a bit further, perhaps gravitation does not need quantization because it is already what we might call naked quantum mechanics, a hybrid of quantum and classical physics. Most of the time we are looking at the outsides of particles, electrons or people, and trying to work out what is going on inside. In the case of gravitation, however, we are inside the particle, the universe, in the midst of the process that shapes the universe.

Energy as we have explained it is a wave, the time division multiplexing of action. If we are to have a universe of zero energy from the beginning, we may interpret this wave as a harmonic oscillator like a pendulum, cycling between kinetic and potential energy. If we compute the action of a formally perfect pendulum over a long period of time using the Lagrangian we find:

S = ∫ (KE - PE) dt = 0

since, averaged over a cycle, its potential energy is equal to its kinetic energy. The potential and kinetic energy of a pendulum are equal because gravitation is a conservative field: the kinetic energy of a freely falling particle increases at exactly the same rate as its potential energy decreases, so the process is (energetically) reversible and the entropy of the system remains (theoretically) constant over time.
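The zero-action claim above can be checked numerically. This is a minimal sketch, not from the source: it integrates the Lagrangian L = KE - PE of a small-amplitude (harmonic) pendulum over one full period, using arbitrary illustrative values for mass, frequency and amplitude.

```python
# Sketch: integrate the Lagrangian KE - PE of a harmonic pendulum over one
# full period. Averaged over a cycle, KE equals PE, so the total is zero.
import math

m, omega, A = 1.0, 2.0, 0.5          # mass, angular frequency, amplitude (assumed)
T = 2 * math.pi / omega              # one full period
n = 100_000
dt = T / n

action = 0.0
for i in range(n):
    t = (i + 0.5) * dt
    x = A * math.cos(omega * t)            # displacement
    v = -A * omega * math.sin(omega * t)   # velocity
    ke = 0.5 * m * v * v
    pe = 0.5 * m * omega**2 * x * x
    action += (ke - pe) * dt               # Lagrangian contribution

print(abs(action))   # very close to zero
```

Over any whole number of cycles the positive and negative contributions cancel exactly, which is the numerical face of the "zero-energy universe" argument in the text.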

An isolated quantum process, going its own way at constant energy without any outside influence, is also reversible. Mathematically this is guaranteed by the wave equation, which preserves unitarity. We may interpret this in terms of communication as the work of a reversible codec (6.1) transforming the state vector. Mathematically this codec is represented in quantum mechanics as a unitary operator, so we are led to suspect the operation of unitary operators in the gravitational transformation of free fall.

A consequence of unitarity is that a normalized quantum system maintains its normalization through time. If identically prepared systems are observed often enough, it will be found that the sum of the probabilities of the outcomes is 1, as predicted by the Born rule. Mathematically, this is a consequence of normalization and shows that the possible outcomes of observation form a complete system of independent events, a property shared with a communication source described by the mathematical theory of communication.
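A small sketch, assuming a 2x2 unitary matrix as a stand-in for Schrödinger evolution: applying a unitary operator to a normalized state vector leaves its norm unchanged, so the Born-rule probabilities still sum to 1 after the transformation.

```python
# Sketch: a unitary operator preserves normalization, so Born-rule
# probabilities always sum to 1. Matrix and state are illustrative.
import cmath, math

theta = 0.7
U = [[cmath.exp(1j * 0.3) * math.cos(theta), -math.sin(theta)],
     [math.sin(theta), cmath.exp(-1j * 0.3) * math.cos(theta)]]

psi = [1 / math.sqrt(2), 1j / math.sqrt(2)]   # normalized state vector

# Matrix-vector multiplication: psi2 = U psi
psi2 = [sum(U[i][j] * psi[j] for j in range(2)) for i in range(2)]

probs = [abs(a)**2 for a in psi2]   # Born rule on the evolved state
print(sum(probs))                   # 1.0 up to rounding
```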

Newton's theory of gravitation has one glaring weak spot, the assumption of action at a distance. It seemed to fly in the face of the common sense view that action requires contact. Einstein solved the problem by producing a field theory of gravitation, and all the other departments of physics have followed suit.

The layered network model identifies symmetries in the universe as the footprints of lower layers which have been applied by higher layers for their own purposes, just as the simple algorithms of arithmetic are instantiated by their users in every application. We might use this model to explain the role of the velocity of light in the emergence of Minkowski space-time as an application instantiated from the underlying quantum mechanics of pure energy. The reason for this is that the null geodesics made possible by the Minkowski metric enable systems to maintain contact (and therefore causality) over all but space-like intervals. Before space emerged, all processes were naturally in contact. This hypothesis gives us some insight into both the quantization of gravitation and the events we call quantum observation or measurement.

The heart of Newtonian physics is to be found in the differential equation F = ma, the classical coupling between force, mass and acceleration. The equivalent in basic quantum mechanics is the equation E = hf, the logical coupling between energy and processing frequency via the quantum of action. Quantum mechanics and gravitation are the fundamental symmetries of a layer in the universal network that sees only energy. They are applied by the layer above them to create the space-time which serves as the operating system of the universe, managing memory and communication.
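To make the coupling E = hf concrete, a short sketch using the standard SI value of the Planck constant (the photon energy chosen here is an illustrative value, not from the source):

```python
# Sketch: the Planck relation E = hf couples energy to "processing
# frequency", as F = ma couples force to acceleration.
h = 6.62607015e-34   # Planck constant, J·s (exact SI value)
E = 3.0e-19          # energy of a green-ish photon in joules (assumed)

f = E / h            # frequency implied by E = hf
print(f)             # around 4.5e14 Hz, i.e. visible light
```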

We understand a particle as a quantum of action embodying a logical process. While the logical operation not is the simplest such process, we imagine that particles may embody the logical equivalent of any halting Turing machine. Larger particles (like myself) may embody layered networks of such machines. We understand all the operations in a local computer to be synchronized by one timing signal. The machines in a network, on the other hand, may be spatially separated and run on independent clocks, introducing a probabilistic element into their interaction. Time-division multiplexing - Wikipedia

I may think of myself as a complex web of network processes which embody every physiological detail of my life down to the level of fundamental particles. I am a particle whose mass is approximately 100 kg occupying about 100 litres of space. From my mass and lifetime I can calculate that the overall human-size quantum action which constitutes my life comprises some 10^60 Planck-size quanta executed over 100 years at the rate of about 10^50 quanta per second, blending seamlessly to make me act as I do. Since the universe may last forever, we cannot calculate its lifetime action, but a plausible mass might be 10^52 times greater than mine, so that it is executing something like 10^100 Planck quanta per second. This we might interpret as the rate of interaction of its population of fundamental particles. Thinking along these lines, we may consider the universe itself as a particle containing a web of network processes. Observable Universe - Wikipedia

Quantum mechanics is a hypothetical mathematical description of the processes driving the four dimensional space-time in which we live our lives. These processes are not visible and we understand them to be in perpetual motion as time goes by. The description of time and energy given in 6.4 above is not continuous but sees both time and energy progressing in discrete steps, one quantum of action at a time. If we put f = 1 in the equation E = hf we see that the quantum of action is both a unit of energy and a unit of time, as we expect when the symmetry underlying energy and time is action. This is consistent with my feeling that continuous quantities have only a platonic existence and the minimum physical representation of information is the quantum of action.

An important mathematical feature of quantum theory is that it is indifferent to the complexity of the situation which it describes. The complexity of any quantum situation may be measured by the number of dimensions of the Hilbert space in which it is modelled. All the features of quantum mechanics are present in a two state system in a two dimensional Hilbert space. Quantum mechanics can carry us from the simplest two state systems to the mathematical limit for the description of consistent systems, that is to the boundaries of logical certainty established by Gödel and Turing.
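A minimal sketch of the two state case, with an illustrative choice of amplitudes: a state in a two dimensional Hilbert space is just a pair of complex amplitudes, and the Born rule turns them into a complete set of probabilities.

```python
# Sketch: a two state system. |psi> = a|0> + b|1>, with the Born rule
# giving the probability of each observable outcome.
import math

a = math.cos(math.pi / 8)   # amplitude of |0> (illustrative)
b = math.sin(math.pi / 8)   # amplitude of |1>

p0 = abs(a)**2              # probability of observing |0>
p1 = abs(b)**2              # probability of observing |1>
print(p0, p1, p0 + p1)      # the two outcomes exhaust the possibilities
```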

The mathematical structures described by the quantum theory are symbolically represented in the literature but the corresponding physical entities are invisible to us. In this the theory differs completely from classical mechanics which is about the motions of clearly visible objects like planets, pistons, gears and pool balls. This difference is the source of the "measurement problem" which has been a bugbear of quantum theory since the beginning. Measurement problem - Wikipedia

This invisibility is not surprising. We communicate with one another in the classical world by body language, which includes such physical things as the sounds of speech, facial expressions, gentle and loving touch and physical violence. We cannot observe the hidden processes that lie behind this language, although, from our experience of ourselves, we can form a pretty good idea of what people are thinking when they act in certain ways. We know that the psychological processes behind our behaviour are the result of physical processes in our nervous systems, and that these hidden processes are often a lot more complex than the physical result. We say that a wink is as good as a nod, but there are circumstances like auctions and romances where hours of thought may go into the decision to wink or nod, let alone say something.

The same thing happens with computing machinery. This computer chugs along at a billion or so logical operations per second. Mostly it is just marking time, waiting for me to hit a key. Then it sets to work, invisibly, to process the keystroke, which may simply mean writing a comma on the screen. Having done this it stops work and begins marking time again unless there is some housekeeping to do. In the computer theoretical world, finishing a task is called halting. The computer reveals the result of its work by printing a comma on the screen. Turing invented his machine to solve Hilbert's decision problem and showed that there are problems that require a computer to make an impossible decision, so it can never halt.

Another practical reason for the invisibility of processes is that communication is itself a logical process. If a machine (person) is fully occupied in some process they may not have the resources to explain what they are doing. Because we are large and complex organisms, we can do two things at once but down in the simple foundations of the universe, it is necessary for a process to stop what it is doing to communicate its results.

To get results from quantum mechanics the invisible substratum of probability amplitudes must be 'measured' or 'observed'. Although it is often said that observation involves the interaction of a classical system with a quantum system, we know that all systems are quantum systems, so observation involves physically embodied quantum systems, ie particles, interacting with one another to yield observable events. As noted above, we place no limits on the size of a particle.

The quantum theory presents information through two channels, one mediated by the Born rule and the other by the eigenvalue equation. The Born Rule predicts the probability of events whose formal nature is described by the eigenvalue equation. The eigenvalue equation picks out fixed points in quantum process which become visible because they are fixed. Born rule - Wikipedia

In section 5.6 above we summarized Feynman's succinct description of the quantum process. We may interpret the computation of a quantum probability as the extraction of a classical fixed point out of the dynamics of an amplitude.

The probability of an event in an ideal experiment is given by the square of the absolute value of a complex number φ which is called the probability amplitude:

P = probability;
φ = probability amplitude;
P = |φ|²

When an event can occur in several alternative ways, the probability amplitude for the event is the sum of the probability amplitudes for each way considered separately. There is interference:

φ = φ₁ + φ₂;
P = |φ₁ + φ₂|²

If an experiment is performed which is capable of determining whether one or another alternative is actually taken, the probability of the event is the sum of the probabilities for each alternative. The interference is lost:

P = P₁ + P₂

He concludes:

One might still like to ask: “How does it work? What is the machinery behind the law?” No one has found any machinery behind the law. No one can “explain” any more than we have just “explained.” No one will give you any deeper representation of the situation. We have no ideas about a more basic mechanism from which these results can be deduced. Feynman, Leighton & Sands FLP III:01: Chapter 1: Quantum behaviour

We represent an amplitude by a complex number which we write φ = x + iy. We imagine this as a wave or a wheel spinning in one direction. We calculate the probability as the absolute square of the complex number, which we obtain by multiplying the number by its complex conjugate, written φ* = x - iy, which we imagine as a wave or wheel going in the opposite direction. The multiplication yields a real number P = |φ|² = (x + iy) × (x - iy) = x² + y². As Feynman notes, we cannot explain why this works, but it does, and it may be another example of the phenomenon observed by Wigner that mathematics is unreasonably effective in the physical sciences. Eugene Wigner: The Unreasonable Effectiveness of Mathematics in the Natural Sciences
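Feynman's two rules can be sketched directly. This is an illustration with arbitrary amplitude values, not a physical calculation: amplitudes add when the alternatives are indistinguishable (interference), probabilities add when the experiment distinguishes them.

```python
# Sketch of Feynman's rules: interference appears when amplitudes are
# added before squaring. Amplitude values are illustrative.
phi1 = complex(0.6, 0.2)
phi2 = complex(-0.3, 0.5)

# Alternatives indistinguishable: add amplitudes, then square
p_interference = abs(phi1 + phi2)**2

# Alternatives distinguished by the experiment: add probabilities
p_classical = abs(phi1)**2 + abs(phi2)**2

print(p_interference, p_classical)   # the two answers differ
```

The difference between the two numbers is the interference term, which is lost as soon as the experiment can tell which way the event occurred.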

The second source of information in quantum theory is the eigenvalue equation which picks out fixed points in the quantum dynamics. Quantum information is encoded in the phase or angle of vectors in Hilbert space. The Born rule uses a metric in Hilbert space, the "inner product" to compute the amplitude, that is the "distance" (in phase) between two states which measures the probability of transition between them. The eigenvalue equation identifies the operations on state vectors which do not change their phase. These state vectors are the ones we see when two states interact. Zurek points out that this selection is necessary for the transfer of information between the particles represented by the state vectors. Wojciech Hubert Zurek: Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical
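A small sketch of the eigenvalue channel, assuming a 2x2 Hermitian matrix with illustrative entries: the eigenvalues of a Hermitian observable (the fixed points we can actually see) come out real, using the standard closed form for the 2x2 case.

```python
# Sketch: eigenvalues of the Hermitian observable [[a, c], [c*, b]].
# Hermitian operators have real eigenvalues: the observable fixed points.
import math

a, b = 1.0, -0.5          # real diagonal entries (illustrative)
c = complex(0.3, 0.4)     # off-diagonal entry; its conjugate mirrors it

mean = (a + b) / 2
spread = math.sqrt(((a - b) / 2)**2 + abs(c)**2)
lam1, lam2 = mean + spread, mean - spread   # both real numbers

print(lam1, lam2)
```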

All the known fundamental particles fall into two classes: fermions and bosons. Fermions play the role of communication sources and their principal property is that any two of them strongly resist occupying the same state in spacetime, the Pauli exclusion principle. They are therefore the quantum mechanical foundation for spatial extension, implementing the logical definition of space in 6.4. Bosons play the role of messengers, carrying information between fermions. They do not obey the exclusion principle. The difference between fermions and bosons is established by the spin-statistics theorem. The proof of this theorem in quantum field theory depends on the commutation relations of fields mapped onto the Minkowski metric of space-time. It may be, however, if quantum theory represents a network layer beneath space-time, that the bifurcation of commutation relations accounts for the Minkowski metric rather than vice versa. Streater & Wightman: PCT, Spin, Statistics and All That pp 146 sqq, Spin-statistics theorem - Wikipedia

Back to top

6.8: Measurement, creation and insight

In the absence of the traditional creative divinity, we assume that the universe creates itself, that is, it makes itself more complex, or increases its entropy. The contact of two quantum systems brings a new observable entity into the world, and possibly destroys others in the process. This is the normal procedure in the accelerator experiments that provide us with much of the information we have about fundamental physics. The question is: do these and other quantum interactions increase entropy?

Feynman's lecture highlights one of the most interesting features of quantum mechanics: some feel that when we look at quantum systems we change their behaviour. An isolated quantum system is believed to evolve deterministically as described by the Schrödinger equation (6.2). This equation has a number of solutions corresponding to the number of dimensions of the Hilbert space to which the equation is applied. All these solutions are considered to exist simultaneously, added together to give a state of "superposition" analogous to the superposition of overtones in a musical note. When we observe the system, however, we see only one of these solutions, and if we quickly observe it again, we see the same state again. This state of affairs is called "the collapse of the wave function". This collapse seems quite likely to be an artefact of the mathematical theory. It is not clear that the solutions of the wave equation are all physically represented in the first place.
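The observation-then-repeat behaviour can be sketched as a toy, not a simulation of the Schrödinger equation: a superposition yields one outcome at random according to the Born rule, and once the state has "collapsed" to that eigenstate, looking again returns the same answer with certainty.

```python
# Toy sketch of "collapse": sample an outcome from a superposition, then
# re-observe the collapsed state. Amplitudes are illustrative.
import random

random.seed(1)
amps = [0.8, 0.6]                  # |psi> = 0.8|0> + 0.6|1>
probs = [a * a for a in amps]      # Born rule: 0.64 and 0.36

outcome = random.choices([0, 1], weights=probs)[0]    # first observation

# Collapse: the state is now the observed eigenstate
collapsed = [1.0 if i == outcome else 0.0 for i in (0, 1)]

repeat = random.choices([0, 1], weights=[p * p for p in collapsed])[0]
print(outcome, repeat)             # the second look agrees with the first
```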

Many discussions of quantum measurement trace its peculiar behaviour to the presence of a conscious physicist doing the measuring. Here we take the view that the universe is continually observing itself through every quantum interaction of physical particles. The increase in entropy arising from quantum interaction suggests a close relationship between communication and creativity. This is consistent with everyday experience. We learn by meeting one another.

Why does the world create itself? The formal answers may lie in a combination of fixed point theory, which suggests that a closed dynamic system has fixed points, and Cantor's theorem, which suggests that it would be inconsistent for the world not to complexify. From Cantor's construction of the transfinite numbers, we guess that the key to complexification is combination and permutation. The general idea seems to be an evolutionary process. A system of pure activity will be inclined to try everything (variation) and those variations which involve local inconsistencies will be weeded out (selection).

In the context of the transfinite network, we understand the act of insight to be scale invariant, working at every level from quantum mechanics to individual human understandings and beyond that to the understanding of communities, nations and the whole human noosphere (6.3). Insight, thus understood, is the source of fixed points in the divine dynamics. In the theology of the Trinity, we may understand the procession of the Word, a new fixed point in the divinity, as an act of insight.

The paradigmatic act of insight invoked by Bernard Lonergan in his eponymous book is Archimedes' realization (in his bath) that the upward force on a submerged body is equal to the weight of fluid displaced. We all experience the act of insight: the often sudden moment of 'seeing' or understanding something. We recognise and observe such isolated acts because they are isolated. Here we understand insight in terms of communication as the act of decoding a message. In normal conversation this action is effectively instantaneous, and we rarely have to stop and think before we can work out what the person speaking to us means. On the other hand, some things can take a long time to understand. One may stare at a chessboard for hours and still not see what to do next. People studied motion for millennia before Einstein arrived at the theory of relativity.

We may understand insight as the conscious recognition that we have found a consistent path through a body of data which amounts, in surveying terms, to a closure. A surveyor may be confident in their observations and calculations if after having worked their way around a circuit back to their starting point they find that their measured and calculated distance from the starting point is close to zero.

A system which moves from a certain starting point through a sequence of operations back to its starting point is cyclic, like a wave. We have already suggested that the logical not operation provides a mechanism for the emergence of energy from action (6.5), the result being a wave each cycle of which corresponds to a quantum of action. The length of this cycle is inversely proportional to the energy involved. This is the simplest possible cycle. Another more complex cycle is the algorithm of reproduction through which living species tend towards eternal life. Saltwater crocodiles reach sexual maturity at about 15 years and their species diverged from their ancestor about 10 million years ago, so the "crocodile wave" may have executed about 600 thousand cycles. Modern humans, on the other hand, emerged about 300 000 years ago, so we are only about 20 thousand generations old.
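The back-of-envelope figures above can be checked in a couple of lines, assuming the values given in the text (15-year generations, 10 million years for crocodiles, 300,000 years for modern humans):

```python
# Checking the generation counts quoted in the text (assumed values).
croc_generations = 10_000_000 / 15    # roughly 600-700 thousand cycles
human_generations = 300_000 / 15      # about 20 thousand generations
print(round(croc_generations), round(human_generations))
```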

Energy is conserved because it is too simple to die. Structures made of energy, like crocodiles, can die due to their complexity, but their power of reproduction greatly extends the life of the species beyond the lives of individuals. From an Aristotelian point of view, we might consider energy as the matter from which all else is constructed, energy being able to take a huge variety of different forms.

Quantum mechanics does not distinguish between potential and kinetic energy. Both enter its equations on an equal footing, and all that counts is frequency differences and the fixed points or nodes that are revealed when these frequencies are superposed. Quantum superposition - Wikipedia

This symmetry in quantum mechanics suggests that it is blind to space and reveals to us only the nodes of the cosmic harmony, which we represent by the eigenvectors and corresponding eigenvalues of quantum transformations. Energy in itself is also blind to the distinction between space and time, and so we see basic quantum mechanics as the study of harmony in an infinite dimensional complex 'frequency space' or 'energy space' which exists without reference to the three dimensional space in which we move. Nevertheless it is the symmetry whose "insight" is the creation of space-time. Space provides a memory for the increase in entropy "desired" by the creative universe. Eigenvalues and eigenvectors - Wikipedia

Back to top

6.9: Gravitation: the structure of space-time

The overall structure of the universe is described by general relativity. We can imagine that this structure was one of the first things to form as the universe expanded. We notice that the modern mathematics of general relativity and of fundamental particles (described by variations of Yang-Mills theory) is quite similar: both are built on Lie groups. Gerardus 't Hooft: 50 Years of Yang Mills Theory

As we noted in chapter 1, the initial singularity enjoys the same three features as the classical god: it exists, it is the source of the universe, and it has no structure. It is, in Aquinas' words, omnino simplex. Hawking and Ellis imagine expansion of the initial singularity to the existing Universe as a time reversed version of the contraction of a black hole. Studies of black holes suggest that although they eventually 'evaporate' for quantum mechanical reasons, the time taken to do so for any reasonable sized black hole is enormous. How, then, can we explain the rapid expansion of the initial singularity implied in the name 'big bang'? Black hole - Wikipedia

The 'big bang' is not an explosion in the ordinary sense of the word, where exploding gases expand rapidly into an already existing space. The big bang describes the increase in size of space-time itself, viewed from the inside. If the Universe is all that there is, there is nothing outside it to expand into. Hard to visualise, but logically consistent! From our point of view, we know that space-time is expanding because the distances to distant galaxies are increasing relative to the size of atoms. The size of atoms is determined by the fundamental physical constants such as the mass and charge of electrons and the quantum of action. Atomic radius - Wikipedia

As Newton realized, gravitation is universal, it affects everything in the Universe without exception. Gravitation is not a force in space, like other physical forces, but a property of space itself. We might understand it to be the property of the formless initial singularity that led it to differentiate itself into the Universe as we know it. Gravitation needs none of the outside help implied in the notion that the Universe is created by a god outside it. How are we to understand this?

Physicists, like surveyors, astronomers and builders use reference frames to provide a basis for measurement. Such frames are artificial, however, even though they may be based on fixed points in nature. These may range from pegs driven into the ground to the intersection of the plane of the Earth's equator with the plane of its orbit around the Sun. Because they are artificial, changing frames of reference should not change the measured reality. The algorithms for mapping from one frame of reference to another must leave reality unchanged. First Point of Aries - Wikipedia

Einstein's special theory of relativity is based on the idea that no matter how we are moving when we measure it the velocity of light is constant. To achieve this the transformations between different frames of reference moving at constant velocity ("inertial frames") must follow the Lorentz transformation. Lorentz transformation - Wikipedia
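The content of the Lorentz transformation can be sketched numerically. This is an illustrative calculation (the event coordinates and relative velocity are arbitrary): the boost mixes t and x, but the spacetime interval (ct)² - x² comes out the same in both frames.

```python
# Sketch: a Lorentz boost at velocity v leaves the spacetime interval
# (ct)^2 - x^2 unchanged. Event coordinates are illustrative.
import math

c = 299_792_458.0          # speed of light, m/s
v = 0.6 * c                # relative velocity of the two frames (assumed)
gamma = 1 / math.sqrt(1 - (v / c)**2)

t, x = 2.0, 1.0e8          # an event: seconds, metres (assumed)

# Lorentz transformation into the moving frame
t2 = gamma * (t - v * x / c**2)
x2 = gamma * (x - v * t)

interval = (c * t)**2 - x**2
interval2 = (c * t2)**2 - x2**2
print(interval, interval2)   # equal: the interval is frame-independent
```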

General relativity deals with accelerated motion. The velocity of light is no longer constant and the Lorentz transformation is no longer sufficient to transform away the effects of accelerated frames of reference. Einstein found that he could replace the invariance of the velocity of light with a new standard for reality, the metric which measures the spacetime distance between events. The general theory is a transformation designed so that no matter how we devise coordinates for measuring processes in the universe, distances between events remain unchanged, as they should.

Gravitation currently falls outside the range of quantum theory. This may suggest that it is embodied in the first generation of fixed points in the dynamics of god. Gravitation conserves entropy, so it behaves very much like the unitary evolution of an unobserved quantum system. Since gravitation sees only energy the exact form of energy is irrelevant to it. Richard Feynman has suggested that the total energy of the Universe could be zero, the gravitational potential being exactly matched by the energy of the particles that fill the universe. Richard Feynman (2002): Feynman Lectures on Gravitation

When asked why it took so long to develop the general theory of relativity, Einstein replied:

The main reason lies in the fact that it is not easy to free oneself from the idea that coordinates must have an immediate metrical meaning. Sunny Auyang: How Is Quantum Field Theory Possible? (link above)

Here we rely on the fact that an act or event has no metric. It is simply a change of state. In a divine Universe, an act is an act of god. Although we imagine god as the whole of reality and the physical quantum of action is exceedingly small, because they have no metric they can both be identified simply as pure action. A quantum event is one quantum of action, as is the procession of the Son from the Father.

Languages are distinguished by the algorithms used to encode and decode them. Gravitation, being a universal language, spoken by every particle in the Universe, would appear to need no such algorithm. Indeed, because it appears to have been operative in the structureless initial singularity, we might consider it to be a null language, that is a string of meaningless identical symbols or acts. This is consistent with the idea that energy is simply a time division multiplexed string of quanta of action.

The mathematical starting point for Einstein's general theory is a differentiable manifold. We might imagine this manifold as little pieces of flat Euclidean spaces, rather like the units of chain mail, hinged together by flexible and elastic joints represented by differentiation. This structure enables us to model the curved and dynamic space which encloses the detailed structure of our Universe. There is no measure of distance in the manifold itself, so the same manifold can represent the whole of space-time, no matter how big or small. It is consistent both with expanding and contracting space-time and with fundamental particles. When Einstein applied this mathematical structure to modelling the Universe he introduced a metric which determines the interval between different points in the Universe. This metric depends on the energy present in each neighbourhood. This scheme of transformation, represented by Einstein's field equation, helps us to understand the large scale structure of the whole universe. Einstein field equations - Wikipedia, Differentiable manifold - Wikipedia

We can make this abstract mathematical model more concrete by imagining each unit of inertial space as a particle with fermionic properties and the relationships of the particles as the messages carried between them by bosons. The energy at any point in this structure is the rate of communication between the particles. The energy measures the 'curvature' of local sections of the network which appears in the formal treatment as the metric of the network space. Because we do not want these communications to interfere with one another, we require the structure to be three dimensional, so that connections can be made without interference.

The Earth, from a relativistic point of view, is a region of high energy or intense communication. It curves the space around itself so we do not fall off. We can see this curvature clearly in the motion of satellites, which are continually falling weightlessly, but nevertheless go round the Earth, following an inertial geodesic. Gravitation becomes perceptible because our normal geodesic flow is interrupted by the presence of the Earth.

Astronomy is one of the oldest sciences, very important for human life. This is indicated by many ancient monuments aligned to celestial phenomena such as the summer and winter solstices and equinoxes. Archaeoastronomy - Wikipedia

Since the maturation, mating and breeding of plants and animals are closely related to the annual cycle of the seasons, there is much to be gained from understanding the astronomical sources of this cycle. The heavens also provide us with a frame of reference for navigation. This has evolved from simple visual observations to the precise observations which are possible with current astronomical instruments. Celestial navigation is particularly important for people travelling in featureless environments like deserts and the sea. The notion that the heavens control events on Earth went beyond agriculture to astrological investigations into love and politics. Much early astronomical work was funded by rulers using astronomical observation to seek heavenly guidance. Astrology - Wikipedia

Ancient observers naturally placed the Earth at the centre of the world and imagined the heavens revolving around it. The modern development of astronomy began by turning this upside down. Now we see the heavens as stationary and the Earth as rotating. The next step forward came when it was realized that the Earth is not at the centre of the world, but orbits around the Sun. Once again, many people found this very hard to believe, but it is now common knowledge.

Measurement of the solar system was well advanced by the time of Isaac Newton. Newton set out to discover why the planets move as they do, and discovered the law of universal gravitation. Massive bodies, like moons and planets, attract one another with a force proportional to the product of their masses divided by the square of the distance between them. Newton's work supported Galileo's view that mathematics is the natural language of the Universe. Isaac Newton - Wikipedia
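Newton's law can be written directly as F = G m₁ m₂ / r². A sketch using standard approximate values (not from the source) for the Earth-Moon pair:

```python
# Sketch: Newton's law of universal gravitation, F = G * m1 * m2 / r^2,
# for the Earth-Moon pair (standard approximate values).
G = 6.674e-11        # gravitational constant, N·m²/kg²
m_earth = 5.972e24   # kg
m_moon = 7.348e22    # kg
r = 3.844e8          # mean Earth-Moon distance, m

F = G * m_earth * m_moon / r**2
print(F)             # roughly 2e20 newtons
```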

Relativity was the next big step in understanding our cosmic habitat. The general theory was published in 1916, and immediately solved one old astronomical problem, the precession of the perihelion of Mercury. Albert Einstein - Wikipedia, Jose Wudka: The precession of the perihelion of Mercury

An important development was the use of spectroscopy to identify the composition of the stars. In 1868 spectroscopists discovered the element helium in the Sun before it was discovered on Earth. At about the same time, spectroscopic measurements began to show that the light from distant galaxies was redshifted, meaning that they were moving away from the Earth. Using this information, Edwin Hubble formulated Hubble's law which correlated the redshift of a galaxy with its distance from Earth and suggested that the Universe was expanding. Helium - Wikipedia, Redshift - Wikipedia
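Hubble's law is a simple proportionality, v = H₀d. A sketch with an approximate value of the Hubble constant and an arbitrary illustrative galaxy distance:

```python
# Sketch: Hubble's law, v = H0 * d. H0 is approximate; the distance is
# an illustrative value, not a real measurement.
H0 = 70.0              # km/s per megaparsec (approximate)
distance_mpc = 100.0   # a galaxy 100 megaparsecs away (assumed)

v = H0 * distance_mpc  # recession velocity in km/s
print(v)               # 7000 km/s, seen as a redshift of the galaxy's light
```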

Since Hubble's time the advent of computers has made detailed modelling of the large scale structure of the Universe feasible. Very large terrestrial telescopes and space based instruments have enabled us to look back to a time when the Universe was very young and much smaller than it is now. One of the most useful sources of information about the early Universe is the cosmic microwave background radiation, first discovered in 1964. Cosmic microwave background - Wikipedia, The Illustris Collaboration

The combination of observation and theory has led to widespread acceptance of the history of the universe known as big bang cosmology. This cosmology combines the relativistic model of the large scale structure of the Universe with the quantum field theory of its microscopic structure known as the standard model. Between them, these two models give us a relatively consistent picture of the evolution of the Universe from very shortly after its origin to hundreds of billions of years into the future. Big Bang - Wikipedia, Standard Model - Wikipedia, Peacock: Cosmological Physics

Back to top

6.10: Cybernetics, algorithms and selection: P & NP

The classical Christian god knows and controls every moment in the Universe. Aquinas explains that God has immediate providence over everything. This is possible because God is both omnipotent and omniscient. How this is possible, given that god is completely simple, is a mystery. This belief is nevertheless a foundation of Christian hope. If god is benevolent toward us, their infinite knowledge and power guarantees that everything is for the best, even though things often look very grim. Accepting this is seen to be a test of faith. Aquinas, Summa, I, 22, 3: Does God have immediate providence over everything?

Here we understand the observable Universe to be the fixed points in the divine dynamics. In the network picture, we understand these fixed points to be messages from god, that is divine revelation. We attempt to understand the relationships between these messages by modelling the underlying dynamics. Our principal tools for this work are quantum theory and communication networks.

Feynman was among the first to realize that there is a correspondence between quantum operators and logical operators. This correspondence is the foundation of quantum information theory, a new and vigorously growing field of research. Developers see two principal advantages in quantum communication. The first is security. The observation of a quantum state causes it to 'collapse' so that any attempt to intercept a message represented as a quantum state breaks the message, thus alerting the communicants to its interception. Nielsen & Chuang: Quantum Computation and Quantum Information

The second is computational power. Proponents of quantum computation believe that because quantum formalism is based on continuous functions in a complete Hilbert space, it can be interpreted as a perfect analogue computer capable of processing infinite superpositions of states simultaneously. Although this may be formally correct, the formal precision of the mathematical model is lost during the measuring process necessary to extract information from the quantum system. This means that from the point of view of actual results, a quantum computation system may have difficulty performing better than a classical Turing machine.

The formal arguments for the determinism of continuous functions require limiting processes that move into the transfinite domain of real numbers. Can these formal arguments be realized? Here we take the view that the limits to mathematical determinism implied by Gödel and Turing's results may prevent real continuous analogue computations from being deterministic, so that the uncertainty manifested by quantum observations may also be present in the quantum dynamics that is the source of these observations.

Given our assumption that the Universe is divine, this constraint on determinism would imply that unlike the classical god with complete knowledge and control, the cosmic god is not completely deterministic. The consistency of the divine dynamics opens it to the indeterminism implicit in the results of Gödel and Turing. Here we assume that the Turing machine marks the limit on computation. Although there are ℵ1 possible mappings of the natural numbers to themselves, there are only ℵ0 possible Turing computable functions, so most of the mappings of the set of natural numbers (or any equivalent set) are incomputable. This implies a large degree of uncertainty in observable processes, accounting for the uncertainty of the world. Extraordinary engineering and procedural precautions must be taken whenever we wish to establish deterministic systems like accident-free air travel.
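The counting argument behind this claim can be sketched in code. Programs are finite strings over a finite alphabet, so they can be listed one by one (they are countable); Cantor's diagonal argument shows that the mappings of the natural numbers to themselves cannot be listed this way. A minimal sketch of the enumeration (alphabet and function names are my own illustration):

```python
# Why computable functions are countable: every program is a finite
# string over a finite alphabet, so programs can be listed in
# length-then-lexicographic order. By Cantor's diagonal argument,
# the functions from N to N cannot all appear in any such list.

from itertools import count, product

ALPHABET = "01"  # any finite alphabet will do

def enumerate_programs():
    """Yield every finite string over ALPHABET, shortest first."""
    for length in count(1):
        for chars in product(ALPHABET, repeat=length):
            yield "".join(chars)

# The first few entries of a countable list that, in principle,
# contains the text of every possible program.
programs = enumerate_programs()
first_six = [next(programs) for _ in range(6)]
print(first_six)  # ['0', '1', '00', '01', '10', '11']
```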

We get a more detailed insight into the relationship of determinism to uncertainty through the cybernetic principle of requisite variety. Gregory Chaitin sees this principle as closely related to Gödel's work on incompleteness. From this point of view, no proof can reach a conclusion containing more information than the hypotheses from which the proof is drawn. This idea holds at all levels of complexity, so that we cannot draw a conclusion comprising n+1 bits of information from n bits of input. Gregory J. Chaitin: Gödel's Theorem and Information (link above)

Chaitin's interpretation of Gödel's theorems shows us the necessary conditions for control in the Universe. This is in a sense a modernization of Aristotle's axiom that no potential can actualize itself, where we use Lonergan's understanding of potency and act: potency means intelligible, act means actually understood. In this case, the possible interpretations of a set of data are effectively infinite until we know the algorithm by which the data were encoded.

The aim of the mathematical theory of communication is to establish deterministic motion from past to future. Because entropy tends to increase, however, the future is generally more complex than the past and so cannot be controlled by it.

Evolution is built upon variation and selection. Variation is possible because some transformations are not computable or controllable. Selection picks out the variations that are consistent with survival and reproduction. Survival and reproduction require control, that is computability. This means we have the situation envisaged by the P – NP problem in the theory of computation. P versus NP problem - Wikipedia

Turing showed that some problems are incomputable, meaning that they cannot be solved by a deterministic process. This does not exclude the possibility that problems may be solved "accidentally" by random processes. This appears to be an important feature of creation. Insofar as creation means introducing structures that have never existed before, it is hard to imagine it being the conclusion of a deterministic logical process.

We imagine the complexification of the universe in terms of Cantor's idea for the generation of transfinite numbers by combining and permuting the natural numbers. The P – NP problem asks whether a solution found by whatever means can be verified by a computable process. Evolution by natural selection suggests that incomputable problems can be solved by random variations and the solutions can be verified by the computable processes of survival and reproduction.
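The pattern of solving by random variation and verifying by computation can be sketched with a standard NP problem, subset sum, where a candidate solution is easy to check even though no efficient general solving method is known. This is my own illustrative analogy to the evolutionary scheme above, not an established result:

```python
# Sketch of "solve by blind variation, verify by computation":
# subset sum is in NP (verification is fast), so we can search by
# random guessing and check each guess with a quick computable test.

import random

def verify(numbers, subset_mask, target):
    """Polynomial-time check: does the chosen subset sum to the target?"""
    total = sum(n for n, chosen in zip(numbers, subset_mask) if chosen)
    return total == target

def random_search(numbers, target, trials=10000, seed=0):
    """Blind variation: guess random subsets until one verifies."""
    rng = random.Random(seed)
    for _ in range(trials):
        mask = [rng.random() < 0.5 for _ in numbers]
        if verify(numbers, mask, target):
            return mask
    return None

numbers = [3, 34, 4, 12, 5, 2]
solution = random_search(numbers, target=9)
print(solution is not None)  # subsets summing to 9 exist, e.g. 4 + 5
```

The solver knows nothing about how to construct a solution; it only varies blindly and keeps what the verifier accepts, which is the analogue of survival and reproduction acting as selection.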

The key to the evolution of living creatures lies in the permanence of the genome. There are many mechanisms in place to ensure the accurate copying of DNA and RNA. Even in creatures that evolve very quickly like viruses, the maximum rate of base changes in the genome seems to be small, about one base per million per generation. Without some sort of memory to preserve "progress so far" it is hard to see how more complex structures can be established by a recursive process of complexification. Rafael Sanjuán et al: Viral Mutation Rates

A solution may lie in the layered structure of computer networks. Although the transfinite model outlined in chapter 5 is based on Cantor's transfinite numbers, transfinity is formally a relative concept, so that we may see 2 as transfinite with respect to 1. If we consider the initial singularity, cardinal 1, as the first layer of the universal network, the second layer may have cardinal 2 and the third cardinal 4 and so on. If we think of computation in terms of clock cycles, the frequency of execution of processes in the higher layers decreases in proportion to their complexity, and they may be said to live longer and so act as memory to preserve the structures of the simpler systems upon which they rely for their existence.
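The layered scheme above can be made concrete with a toy model. This is purely an illustrative assumption following the text (doubling complexity, halving clock rate per layer), not established physics:

```python
# Toy model of the layered network sketched above: each layer doubles
# in complexity while its processing frequency halves, so the more
# complex layers change more slowly and can act as memory.

def layers(n_layers, base_complexity=1, base_frequency=1.0):
    """Return a list of layers with doubling complexity, halving frequency."""
    result = []
    complexity, frequency = base_complexity, base_frequency
    for i in range(n_layers):
        result.append({"layer": i,
                       "complexity": complexity,
                       "frequency": frequency,
                       "lifetime": 1.0 / frequency})
        complexity *= 2    # each layer doubles in complexity...
        frequency /= 2.0   # ...and runs at half the clock rate
    return result

for layer in layers(4):
    print(layer)
# complexity grows 1, 2, 4, 8 while characteristic lifetime grows 1, 2, 4, 8
```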

The details of this process of evolution are not clear to me, but, given that the universe did start as a completely simple initial singularity and that it is now exceedingly complex, there must be some process based on the preservation of randomly constructed recursive structures to explain this complexification. The fact that we find group theory widely represented in the structure of the world suggests that grouplike structures, insofar as they are closed and self-perpetuating, are an important feature of this process.

Back to top

6.11: Representation: fundamental particles

So far we have created a universe of energy out of the initial singularity of action by the application of the logical not operator to make a wave of sequential actions. The existence of energy enables the existence of gravitation and quantum mechanics, but we have yet to see details which might account for the large scale structure of the Universe, and my guess is that we might find the source of this structure in quantum mechanics. We discussed quantum mechanics in an abstract form in chapter 5, but now we turn to making it real, which raises problems which fall under the general heading of representation.

Historically meaning, knowledge and information have been considered as rather immaterial aspects of the world, but here we agree with Landauer that information is a physical entity and the same goes for knowledge and meaning. To get some grip on the representation problem, we need to look at the philosophical step that marked the difference between Plato and Aristotle.

Many philosophers have tried to understand the nature of the Universe by studying the nature of knowledge. This present project started in the 1960s with Bernard Lonergan's effort, in Insight, to put Thomistic metaphysics on a new more modern footing. He had to be careful what he said, and not go too close to the boundary of orthodoxy. It was necessary for him to devise a metaphysical model which respected the business plan of his church. This required him to classify knowledge into humanly intelligible proportionate knowledge accessible to science and inaccessible transcendent knowledge, the exclusive intellectual property of his church acquired by direct revelation from god. As required, this model puts the nature of god beyond human comprehension.

The Papacy responded to the "Modernist Crisis" in the Church with their standard approach of condemning new ideas and reaffirming old ones. In 1864 Pope Pius IX produced the Syllabus of Errors, a condemnation of 80 statements which contradicted Catholic belief. On the positive side his successor, Pius X, advised the Italian schools of theology that the principles and major opinions of Thomas Aquinas should be held conscientiously. This led the schools to draft a list of 24 Thomistic theses attempting to capture the essence of Thomism. These were approved by the Congregation of Studies as safe directive norms. As far as I know these remain in place and give some substance to the requirement in Canon Law that aspirants to the priesthood be trained in Thomistic theology. Pius X: On the doctrines of the modernists Pascendi dominici gregis, P. Lumbreras: The Twenty-Four Fundamental Theses of Official Catholic Philosophy: Part I, John Paul II (1983): Code of Canon Law: Canon 252: §3.

The key thesis, from our point of view, is number 23, which harks straight back to Plato who considered his immaterial forms the true subject of intellectual knowledge:

Intellectuality necessarily follows immateriality, and in such a manner that the degree of intellectuality is in proportion to the remoteness from matter. The adequate object of intellection is being as such; but the proper object of the human intellect, in the present state of union, is restricted to the essences abstracted from material conditions.

The official philosophy of the Catholic Church holds, in effect, that real reality is immaterial, and that the human soul, since it has intellectual powers, must also be immaterial. The problem here is that immaterial effectively means invisible, since our senses depend upon physical particles of one sort or another to gather information. God, therefore, must be invisible, which completely contradicts the premise upon which our scientific theology is built, that the god is identical to the universe and therefore scientifically observable.

The original answer to this problem comes from Aristotle's modification of Plato's theory of forms. From Plato's point of view, the forms are eternal and immutable. Aristotle accepted this but accommodated change by imagining that change involved the replacement of one form by another in the same matter, as we might, for instance, mould the bronze of a sword into a ploughshare. As a by-product, in effect, forms were represented in a physical guise that opened them to observation and study, so we could learn about horses by studying actual physical horses.

Since Plato's forms were invisible, he had to devise a different theory of knowledge. He imagined that the souls of the unborn once lived in the heaven of forms and came to know them. People were born full of this innate knowledge, even though they did not know this. Plato imagined a methodology which has become known as the Socratic Method for eliciting the knowledge that was believed to already be present in peoples' minds. Platonic epistemology - Wikipedia

Following Parmenides, Plato drew a sharp distinction between knowledge, which was based on the forms, and opinion which was derived from interacting with the world and its people. This idea eventually filtered into Gnosticism, where the goal was to achieve knowledge of the supreme divinity through some form of mystical or esoteric insight.

The Platonic influence is very strong in some approaches to quantum field theory. Some hold that the invisible fields, analogous to the Platonic ideas, are the real reality and that the particles we actually observe are of lesser importance. I have already quoted the philosopher Auyang:

According to the current standard model of elementary particle physics based on quantum field theory the fundamental ontology of the world is a set of interacting fields.

Although this system works quite well, it leads to problems of infinity and ludicrously large estimates for a number of measured physical parameters which fall under the general category of "cosmological constant problems".

Silvan Schweber summarizes the sources of the infinities:

These difficulties stem from the fact that (i) the description is in term of local fields (i.e., fields that are defined at a point in space-time), which are systems with an infinite number of degrees of freedom, and (ii) the interaction between fields is assumed to be local. . . .

In QED the local interaction terms imply that photons will couple with (virtual) electron-positron pairs of arbitrarily high momenta and result in divergences, and similarly electrons and positrons will couple with (virtual) photons of arbitrary high momenta and give rise to divergences. These problems impeded progress throughout the 1930s, and most of the workers in the field doubted the correctness of QFT in view of these divergence difficulties. Silvan S. Schweber: The sources of Schwinger's Green's functions

The infinity problems were ultimately removed or hidden in the late 40s by renormalization, which opened the way for a theory of quantum electrodynamics which has yielded theoretical results indistinguishable from the best available measurement. It seems unlikely that this is the last word, however, since the theory continues to yield ludicrous results. Frank Wilczek, one of the Nobel prizewinners for the development of quantum chromodynamics, sings the praises of the new theory in his book on the subject, but has to admit, in very small type, that various computations yield results which are 10⁴⁴, 10⁵⁶ and 10¹¹² times greater than observation. These figures represent the greatest differences yet between physical calculations and reality and point to some difficulty with the calculations. Frank Wilczek: The Lightness of Being: Mass, Ether, and the Unification of Forces

Insight is considered to be an act of intelligence and we are inclined to think of ourselves as the only really intelligent species on the planet. Here, however, we see insight and intelligence at all levels of the Universe. We have already speculated about the origin of time and space to give us 4-dimensional space-time. The first step, from pure action to time and energy, lays the foundations for quantum mechanics.

Back to top

6.12: Quantum communication: bosons connect fermions

Although our theories of fundamental particles raise many questions, there is no doubt about their existence and behaviour and we have seen an explosion in technology arising from the application of quantum systems. These technological advances have improved our understanding of the world at all scales from fundamental particles through genetics and molecular physiology to the structure of the universe.

General relativity is one of the most amazing results of the application of continuous mathematics and calculus. At present no quantum field theory of gravitation has been created because it has been found impossible to use renormalization to eliminate the infinities that appear in quantum theories of gravity. Quantum field theories are an attempt to unite quantum theory and special relativity. Trouble in this union starts at the very beginning because these theories have a foot in each of two very different spaces. Hilbert space is the natural home of quantum theory and has next to nothing to do with classical Minkowski space-time; Minkowski space, on the other hand, is the home of special relativity. The central problem is that quantum theory does not provide a mechanism for the creation and annihilation of discrete particles, while special relativity, by establishing the same relationship between momentum and energy as there is between space and time, interprets all motion as a process of creation and annihilation.

The usual approach to resolving this dilemma is to assume that space-time is the domain of Hilbert space, so that the Lorentz transformations we use to relate systems in relative inertial motion are also applied to the Hilbert spaces that describe these systems. In 6.6 we imagined that spacetime may be an application of quantum mechanics. This suggests that transforming Hilbert spaces with Lorentz transformations may be placing the cart before the horse. If there is to be a quantum theory of gravitation, we might expect quantum mechanics to be in place before space time structure emerges.

Back to top

6.13: Network QFT, QED and QCD

The most common representation of quantum mechanics is Erwin Schrödinger's wave mechanics. Historically this was the second successful version of quantum mechanics. The first, matrix mechanics, was published by Heisenberg, Born and Jordan slightly before Schrödinger. It was soon found that both were different mathematical representations of the same idea. The work was continued and perfected by Dirac's transformation theory, and von Neumann tidied up the mathematics by using abstract Hilbert space as the domain for quantum mechanics. Schrödinger equation - Wikipedia, Matrix mechanics - Wikipedia, Paul Dirac: The Principles of Quantum Mechanics, John von Neumann: The Mathematical Foundations of Quantum Mechanics (link above)

The various representations of quantum mechanics listed here point to a particular difficulty with the theory. It differs from classical Newtonian mechanics on three points. First it is mostly about particles that are too small to see, whereas classical mechanics deals with planets, moons and other macroscopic objects; second classical mechanics provides clear connections between its mathematical model and physical realities like positions in space, velocities and masses, whereas quantum states are described by vectors in a space with any number of dimensions from one to countable infinity; and third, the inner workings of quantum mechanics are mostly expressed in complex numbers which do not directly correspond to anything observable.

Space adds to time the possibility of two or more things existing at once, spatially separated. Whereas it would be a local contradiction for p and not-p to exist at the same place at the same time, they can both exist at the same time in different places. Traditional theology denies that God is a body which occupies space because they understand that space is infinitely divisible and therefore potential, contradicting the nature of god, which is pure act. The logical definition of space proposed here does not imply potential in the ancient sense, and so is in no way inconsistent with divinity. Aquinas, Summa: I, 3, 1: Is God a body

Nor is space a passive container, as the ancients imagined. It is the home of momentum, and momentum and distance relate to one another very much like energy and time. Whereas energy is the rate of action in time, momentum is the rate of action in space, so space and time are connected by action through the velocity of light. We think that energy came first as action bifurcated into energy and time; now energy-time bifurcates again into momentum and space. Space-time and energy-momentum are mathematically represented in physics in exactly the same way, pivoting around the fundamental metric of the universe which is action.

Descartes made a good start when he noted that clear and distinct ideas are a criterion of truth. In my Scholastic days, one of the watchwords was opportet distinguere: it is necessary to distinguish the different meanings of a term to get a clear idea of what we are talking about. Much of the progress in science can be attributed to sharpening our language as we confront the unbelievable complexity of the universal structure, detailed events stretching right down to the quantum of action. A fly's footprint involves the coordination of trillions of trillions of quanta of action.

The biggest problem facing physics for the last century has been the interface between quantum mechanics and relativity, that is the meeting of energy and momentum with time and space. This has led to persistent problems with the appearance of infinities in the mathematics which are very unlikely to actually exist in reality. In his Nobel lecture Feynman characterized renormalization, the technique devised to remove these unlikely infinities, as sweeping problems under the rug:

I don’t think we have a completely satisfactory relativistic quantum-mechanical model, even one that doesn’t agree with nature, but, at least, agrees with the logic that the sum of probability of all alternatives has to be 100%. Therefore, I think that the renormalization theory is simply a way to sweep the difficulties of the divergences of electrodynamics under the rug. I am, of course, not sure of that.
Here, I feel, we owe clarity once again to Feynman, who was among the first to realize that quantum mechanics has very little to do with space-time physics and is in fact a description of the substratum of computation that makes the world go round. Misner, Thorne and Wheeler touched on this idea with their "pregeometry", and in the last few decades we have seen an explosion of interest in quantum computation, still somewhat muddied by its origins in the continuous mathematics of classical physics. Richard P. Feynman: Nobel Lecture: The Development of the Space-Time View of Quantum Electrodynamics, Misner, Thorne & Wheeler: Gravitation

Renormalization has become so central to field theory that renormalizability is now one of the criteria for a valid theory. Current theory sees gravitation as unrenormalizable, so it remains outside the standard model. Huang points out that the rot set in when people began to use continuous mathematics to compute the self energy of the electron. Wilson identified the problem as a matter of scale. Here, since we have nothing to lose, we reject continuous mathematics, replacing it with quantized action, and base our picture on the scale invariance of communication networks. More detailed discussion would take us too far afield for this exploratory essay. Kerson Huang: A Critical History of Renormalization, Kenneth G Wilson: Nobel Lecture: The Renormalisation Group and Critical Phenomena

Let us therefore assume that non-relativistic quantum mechanics describes the logical and computational layer of the universe that emerged on the foundation of energy and time and is itself the foundation of space-time. We have already noted that the foundation of a practical computer is the clock that keeps everything in synchronization, and that the clock itself is in fact a rudimentary computer, executing a string of nots which creates a wavelike structure which we can model with a qubit |ψ> = cos(t)|0> + i sin(t)|1>, normalized by the fact that cos²(t) + sin²(t) = 1. Nielsen & Chuang: Quantum Computation and Quantum Information (ref above)
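The normalization of this clock qubit can be checked directly: the squared magnitudes of the two amplitudes, |cos t|² and |i sin t|², sum to 1 at every time t. A minimal sketch (function names are my own):

```python
# The clock qubit from the text: |psi(t)> = cos(t)|0> + i sin(t)|1>.
# Its norm is |cos t|^2 + |i sin t|^2 = cos^2 t + sin^2 t = 1 for all t.

import math

def clock_state(t):
    """Amplitudes (a0, a1) of the clock qubit at time t."""
    return math.cos(t), 1j * math.sin(t)

def norm_squared(a0, a1):
    """Total probability carried by the two amplitudes."""
    return abs(a0)**2 + abs(a1)**2

for t in (0.0, 0.7, math.pi / 3, 2.0):
    a0, a1 = clock_state(t)
    assert math.isclose(norm_squared(a0, a1), 1.0)
print("state is normalized at every sampled time")
```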

From a picture of the mathematical community (6.3) we may conclude that the real action occurs inside the mathematicians. Scale invariance suggests that the same can be said for fundamental particles, quanta of action embodying computational processes that determine their behaviour when communicating with one another in the cosmic network. We may say the mathematicians are fermions. All their words, questions, papers, books, speeches and videos fall under the heading of bosons. The action goes on inside the fermions and they communicate with one another via the bosons. The principal property of fermions is that they stand aloof and protect their integrity while bosons are happy to share their space. We may see this as the quantum mechanical source of 4D spacetime and the root of gravitation.

Although fundamental particles have zero size they are nevertheless capable of representing computations by logical confinement which is a consequence of fixed point theory. Each particle is an image of the universe, closed, logically continuous, convex and animated by a quantum of action.

Quantum mechanics works well in low energy situations where, apart from photons, particles are moving slowly compared to the velocity of light. Apart from photons (once more) the particles involved can be considered as permanent structures. As velocities and energies go up, however, things change. First, since mass and energy are effectively identical, massive particles can be created and annihilated; and second, there is a complex issue of virtual particles, that is unobservable particles which even at relatively low energies can be created and annihilated within a quantum pixel because this is allowed by Heisenberg's uncertainty principle as long as their energy ΔE is inversely proportional to their lifetime Δt, so that ΔE × Δt ≈ h, where h is Planck's constant. Uncertainty principle - Wikipedia
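The uncertainty relation ΔE × Δt ≈ h gives an order-of-magnitude estimate of how long a virtual particle of a given energy can exist. A hedged numerical sketch (constants are standard reference values; the pair-creation example and function name are my own illustration):

```python
# Rough lifetime allowed by the relation dE * dt ~ h for a virtual
# particle of a given energy. Order-of-magnitude sketch only.

H_PLANCK = 6.626e-34   # Planck's constant, J s
EV = 1.602e-19         # one electronvolt in joules

def max_lifetime(delta_e_joules):
    """dt ~ h / dE: the larger the borrowed energy, the shorter the life."""
    return H_PLANCK / delta_e_joules

# A virtual electron-positron pair costs about 1.022 MeV (2 x 511 keV).
delta_e = 1.022e6 * EV
dt = max_lifetime(delta_e)
print(f"allowed lifetime is roughly {dt:.1e} s")  # on the order of 4e-21 s
```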

A further problem arises because it is believed that no causal influence can travel through space at a velocity greater than light so that the actual causal interactions of quantum systems are local, assumed to occur between fundamental particles of zero size in contact with one another. This interaction is explained by quantum field theory, which is worked out in the continuous complex space of quantum fields. These fields are mathematical functions built on the foundation of continuous spacetime.

The finite size of the quantum of action and the relationships between energy, time, distance and momentum mean that if we want to see processes that occur very quickly in very small regions of space-time, we must observe them at very high energy and momentum. This is why the physics community has been driven over the last century to devise machinery operating at ever greater levels of energy and momentum to observe particles at the smallest possible scale. These machines are in effect giant microscopes, some many kilometres in size.

Although this machinery works by the laws of nature and is capable of imitating conditions inside the sun, and even very close to the first moments of the expanding universe, the mathematical theory has problems with infinity. Although the universe clearly works as a consistent whole, we cannot yet say the same about our mathematical models of the universe. It is widely recognised that the theory we have so far is a temporary expedient en route to a complete theory of the fundamental structure of the world. We know enough, however, to live in an era of explosive development of new technologies based on quantum theory.

Back to top

6.14: Corruption and death

Aristotle and his contemporaries thought that the intellectual element of a human being is immortal or eternal. This is because they thought of matter as something inert and incapable of understanding, so that the understanding part of our minds must be spiritual. They thought that spiritual beings, because they had no parts, were incapable of coming apart, and so everlasting.

Quite reasonable, really, but impossible and so wrong.

The transfinite numbers explain imagination and variation. From a formal mathematical point of view we find no problem imagining the infinite set of natural numbers and the transfinite sets that are built on them. We observe in reality, however, that all information is represented physically and the physical world is very complex, a veritable haystack of signals travelling between billions of sources. Where signals cross they may become corrupted. In this case sources that rely on reliable signalling may lose contact with one another, leading to a breakdown in the bond between them. In the darker realms of the human world, bad agents may do what they can to corrupt signals to break up human relationships, businesses or political parties. The internet has provided extensive new opportunities for such activity, often allowing perpetrators to hide themselves by encryption and other means. Propaganda - Wikipedia

In living organisms many structural errors can be corrected by the death and recycling of damaged cells and the reproduction of pristine replacements. This does not apply to all tissues however, and so we see the effects of ageing in our skin and hair and feel them in increasing stiffness and weakness of our bodies. These natural processes are supplemented by disease and accident so that our ultimate fate is death through fatal error of one sort or another. Ancient dreams of eternal life do not have a place in modern biology.

Back to top

6.15: Evil

The President of the United States, Ronald Reagan, called the Soviet Union an evil empire, while at the same time trying to encourage Gorbachev and the Soviet leadership to liberalize their regime and convert to the American way of life. His words were another step in the long-held desire to free Soviet citizens from oppression, rather as American liberals had ultimately hoped to free the slaves.

Deception is one of the important tools of evolutionary survival. Plants and animals often use it. Among the functions of deception are both escape from predation and the capture of prey. From the point of view of a hunter, a successful hunt is a good; from the point of view of the prey, it is an evil. Good and evil are thus indissolubly linked in the biological world.

The network model suggests that there are two roots of evil. One lies in the small size of the computable space when compared to all possible space. We may say that the computable space is full, and that this fullness is represented by the conservation of energy through time. Quantum mechanics identifies energy with frequency, and here we like to link frequency and processing rate. As a consequence of the conservation of energy we have the conservation of matter.

When we are in computable space, that is in space that we can control, evil can be avoided. This is the fundamental rule of road safety: maintain control of the vehicle. The same principle applies to occupational health and safety: we seek to have no uncontrolled events ('accidents') in the workplace. Ideally, there is a tested procedure for every action and enough control to make sure things proceed as planned.

Since all information is represented physically, the availability of the physical resource necessary to represent the information places a limitation on the processing power of systems. So we find in practical computers limitations on physical memory and processing speed. On the other hand, a person needs a certain amount of nutrition and a certain amount of body mass to survive. From this point of view, we may see all evil as a form of starvation, insufficient resources being available to prevent error.

The second source of evil lies in the hierarchical structure of the universe. In general lower layers are more energetic and violent than higher layers. A bullet tearing through flesh is acting naturally for a bullet, a high energy lump of metal, but it is doing evil from the point of view of a living body. This source of evil is particularly important in occupational health and safety, since a large proportion of our industrial processes require high energy, high pressure, work at great heights and motion at great speed, all of which can get out of hand.

Most animals kill other animals and plants to eat, but humans, with our superior intelligence, have learnt to kill for more sophisticated reasons. So farmers kill weeds, insect and animal pests. Raiders and warlords, on the other hand, are inclined to kill farmers to take their food and land. With improvements in weapons and methods of war, some warlords have been able to establish hegemony over large numbers of people and ultimately establish large empires. Much of recorded human history is the history of the wars of warlords, kings and emperors.

We may place evils on a spectrum ranging from what we might call natural evils, like those implicit in survival by hunting and gathering, disease and natural disasters, to human-specific evils that run from lethal domestic violence to full-scale war on the one hand, and from mild environmental modification to wholesale destruction of ecosystems on the other. We will turn to human evil in chapter 9.

Bad things die out, although this may take a long time, as we see in the slow reduction of poverty and disease in the world. This reduction is often impeded by the selfish behaviour of individuals and groups who would prefer to destroy rather than to build for political reasons. We turn to politics in chapter 8.

Back to top

6.16: Does the model fit the world?

Many feel that the world is eternal: it has always been like this and the question of its origin is meaningless. Others feel that it had a beginning, and are compelled therefore to wonder how it came to be this way. The Christian tradition answers this question with a simple hypothesis: there is a narcissistic omniscient and omnipotent being who created this universe and populated it with intelligent beings that would bring glory to their creator by worshipping them. Although this idea is very durable and was probably motivated by the existence of narcissistic warlords who were likely to kill those who did not bow low enough, it may not seem very plausible.

The big bang hypothesis does not seem much better. There had to be something to go bang, and since it is somewhat plausible that space and time began with the big bang, the prior state must have existed without space and time and so, like the god of Aquinas, lacked the structure necessary to be either omniscient or omnipotent.

The assumption in the big bang hypothesis that the precursor of the universe comprised all the energy of the current universe raises two further difficulties. First, can we make any sense of an entity of zero size containing a huge amount of energy, implying that its energy density is infinite? This is a generic problem for modern physics, since all the proposed fundamental particles also have zero size and finite energy, implying infinite energy density. The second problem is that since energy is conjugate to time, the initial entity might have had no size, but, like all the energetic fundamental particles, it must have existed in time.
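The infinite-density worry can be made concrete by holding the energy fixed while shrinking the region that contains it. A minimal sketch (the specific radii are illustrative, not physical claims):

```python
# The zero-size problem made concrete: hold the energy E fixed and let the
# radius r of the sphere containing it shrink toward zero. The density
# E / V grows without bound, since V = (4/3) * pi * r**3 -> 0.

import math

def energy_density(energy_joules: float, radius_metres: float) -> float:
    """Energy density (J/m^3) of energy E confined to a sphere of radius r."""
    volume = (4.0 / 3.0) * math.pi * radius_metres ** 3
    return energy_joules / volume

E = 1.0  # any fixed, finite energy (J)
for r in (1e-3, 1e-9, 1e-15, 1e-21):
    print(f"r = {r:.0e} m -> density = {energy_density(E, r):.3e} J/m^3")
# Each thousandfold reduction in radius multiplies the density by 1e9.
```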

High energy physicists use their machinery to create energy bubbles of finite size which decay into fundamental particles. Millions of experimental collisions have shown that there is a limited spectrum of these particles and they all have well defined properties. How are we to account for all this structure arising from what we take to be a more or less structureless blob of energy?

Much of the information we have about the relationships of particles to one another is consistent with the mathematical theory of groups, but we also rely on a lot of unexplained empirical information about the masses, charges and other features of these particles. It is clear that we still have a lot to learn.

I see this essay so far as a string of obiter dicta (Latin: remarks made in passing) written while wandering around in the space of problems raised by the relationship between god and physics. We will now turn to exploring whatever lessons have been learnt on this journey to our own roles as particles in a divine universe. We know that we came to be the way we are by a very long process of evolution. It may be that the nature of the fundamental constituents of the universe will ultimately be explained by a similar process.

(Revised 24 December 2020)

Back to top

Back to table of contents

Copyright:

You may copy this material freely provided only that you quote fairly and provide a link (or reference) to your source.

Further reading

Books

Augustine, Saint, and Edmond Hill (Introduction, translation and notes), and John E Rotelle (editor), The Trinity, New City Press 1991. Written 399-419: De Trinitate is a radical restatement, defence and development of the Christian doctrine of the Trinity. Augustine's book has served as a foundation for most subsequent work, particularly that of Thomas Aquinas.  
Amazon
  back

Auyang, Sunny Y., How is Quantum Field Theory Possible?, Oxford University Press 1995 Jacket: 'Quantum field theory (QFT) combines quantum mechanics with Einstein's special theory of relativity and underlies elementary particle physics. This book presents a philosophical analysis of QFT. It is the first treatise in which the philosophies of space-time, quantum phenomena and particle interactions are encompassed in a unified framework.' 
Amazon
  back

Barrow, John D., and Frank J. Tipler, The Anthropic Cosmological Principle, Oxford University Press 1996 'This wide-ranging and detailed book explores the many ramifications of the Anthropic Cosmological Principle, covering the whole spectrum of human inquiry from Aristotle to Z bosons. Bringing a unique combination of skills and knowledge to the subject, John D. Barrow and Frank J. Tipler - two of the world's leading cosmologists - cover the definition and nature of life, the search for extraterrestrial intelligence, and the interpretation of the quantum theory in relation to the existence of observers.' 
Amazon
  back

Bastin, Ted, and C W Kilmister, Combinatorial Physics, World Scientific 1995 About this book (World Scientific) 'The authors aim to reinstate a spirit of philosophical enquiry in physics. They abandon the intuitive continuum concepts and build up constructively a combinatorial mathematics of process. This radical change alone makes it possible to calculate the coupling constants of the fundamental fields which — via high energy scattering — are the bridge from the combinatorial world into dynamics. The untenable distinction between what is ‘observed’, or measured, and what is not, upon which current quantum theory is based, is not needed. If we are to speak of mind, this has to be present — albeit in primitive form — at the most basic level, and not to be dragged in at one arbitrary point to avoid the difficulties about quantum observation. There is a growing literature on information-theoretic models for physics, but hitherto the two disciplines have gone in parallel. In this book they interact vitally.' 
Amazon
  back

Carlson, James, and Arthur Jaffe & Andrew Wiles, The Millennium Prize Problems, Clay Mathematics Institute and American Mathematical Society 2006
1: The Birch and Swinnerton-Dyer Conjecture: Andrew Wiles
2: The Hodge Conjecture: Pierre Deligne
3: The Existence and Smoothness of the Navier-Stokes Equation: Charles L Fefferman
4: The Poincare Conjecture: John Milnor
5: The P versus NP Problem: Stephen Cook
6: The Riemann Hypothesis: Enrico Bombieri
7: Quantum Yang-Mills Theory: Arthur Jaffe and Edward Witten 
Amazon
  back

Chaitin, Gregory J, Information, Randomness & Incompleteness: Papers on Algorithmic Information Theory, World Scientific 1987 Jacket: 'Algorithmic information theory is a branch of computational complexity theory concerned with the size of computer programs rather than with their running time. . . . The theory combines features of probability theory, information theory, statistical mechanics and thermodynamics, and recursive function or computability theory. ... [A] major application of algorithmic information theory has been the dramatic new light it throws on Goedel's famous incompleteness theorem and on the limitations of the axiomatic method. . . .' 
Amazon
  back

Damasio, Antonio R, The Feeling of What Happens: Body and Emotion in the Making of Consciousness, Harcourt Brace 1999 Jacket: 'In a radical departure from current views on consciousness, Damasio contends that explaining how we make mental images or attend to those images will not suffice to elucidate the mystery. A satisfactory hypothesis for the making of consciousness must explain how the sense of self comes to mind. Damasio suggests that the sense of self does not depend on memory or on reasoning or even less on language. [It] depends, he argues, on the brain's ability to portray the living organism in the act of relating to an object. That ability, in turn, is a consequence of the brain's involvement in the process of regulating life. The sense of self began as yet another device aimed at ensuring survival.' 
Amazon
  back

Darwin, Charles, and Greg Suriano (editor), The Origin of Species, Gramercy 1998 Introduction: 'In considering the Origin of Species, it is quite conceivable that a naturalist, reflecting on the mutual affinities of organic beings, on their embryological relations, their geographical distribution, geological succession, and other such facts, might come to the conclusion that each species has not been independently created, but has descended, like varieties, from other species.' 
Amazon
  back

Dirac, P A M, The Principles of Quantum Mechanics (4th ed), Oxford UP/Clarendon 1983 Jacket: '[this] is the standard work in the fundamental principles of quantum mechanics, indispensable both to the advanced student and the mature research worker, who will always find it a fresh source of knowledge and stimulation.' (Nature)  
Amazon
  back

Feynman, Richard, Feynman Lectures on Computation, Perseus Publishing 2007 Amazon Editorial Reviews Book Description 'The famous physicist's timeless lectures on the promise and limitations of computers. When, in 1984-86, Richard P. Feynman gave his famous course on computation at the California Institute of Technology, he asked Tony Hey to adapt his lecture notes into a book. Although led by Feynman, the course also featured, as occasional guest speakers, some of the most brilliant men in science at that time, including Marvin Minsky, Charles Bennett, and John Hopfield. Although the lectures are now thirteen years old, most of the material is timeless and presents a "Feynmanesque" overview of many standard and some not-so-standard topics in computer science such as reversible logic gates and quantum computers.'  
Amazon
  back

Feynman (1988), Richard, QED: The Strange Theory of Light and Matter, Princeton UP 1988 Jacket: 'Quantum electrodynamics - or QED for short - is the 'strange theory' that explains how light and electrons interact. Thanks to Richard Feynman and his colleagues, it is also one of the rare parts of physics that is known for sure, a theory that has stood the test of time. . . . In this beautifully lucid set of lectures he provides a definitive introduction to QED.' 
Amazon
  back

Feynman (2002), Richard, Feynman Lectures on Gravitation, Westview Press 2002 Amazon Editorial Reviews Book Description 'The Feynman Lectures on Gravitation are based on notes prepared during a course on gravitational physics that Richard Feynman taught at Caltech during the 1962-63 academic year. For several years prior to these lectures, Feynman thought long and hard about the fundamental problems in gravitational physics, yet he published very little. These lectures represent a useful record of his viewpoints and some of his insights into gravity and its application to cosmology, superstars, wormholes, and gravitational waves at that particular time. The lectures also contain a number of fascinating digressions and asides on the foundations of physics and other issues. Characteristically, Feynman took an untraditional non-geometric approach to gravitation and general relativity based on the underlying quantum aspects of gravity. Hence, these lectures contain a unique pedagogical account of the development of Einstein's general theory of relativity as the inevitable result of the demand for a self-consistent theory of a massless spin-2 field (the graviton) coupled to the energy-momentum tensor of matter. This approach also demonstrates the intimate and fundamental connection between gauge invariance and the principle of equivalence.' 
Amazon
  back

Hawking, Stephen W, and G F R Ellis, The Large Scale Structure of Space-Time, Cambridge UP 1975 Preface: 'Einstein's General Theory of Relativity . . . leads to two remarkable predictions about the universe: first that the final fate of massive stars is to collapse behind an event horizon to form a "black hole" which will contain a singularity; and secondly that there is a singularity in our past which constitutes, in some sense, a beginning to our universe. Our discussion is principally aimed at developing these two results.' 
Amazon
  back

Heyes, Cecilia, Cognitive Gadgets: The Cultural Evolution of Thinking, Belknap Press: Harvard University Press 2018 “Cecilia Heyes presents a new hypothesis to explain the one feature that distinguishes Homo sapiens from all other species: the mind. Through lucid, compelling writing, this masterly exegesis proposes that the key features of the human mind, termed ‘cognitive gadgets,’ are the products of cultural rather than genetic evolution. It will stimulate its readers to think deeply, as Heyes has done, about what it means to be human.”―Lord John Krebs, University of Oxford 
Amazon
  back

Joseph, George Gheverghese, The Crest of the Peacock: Non-European Roots of Mathematics, Princeton University Press 2010 'From the Ishango Bone of central Africa and the Inca quipu of South America to the dawn of modern mathematics, The Crest of the Peacock makes it clear that human beings everywhere have been capable of advanced and innovative mathematical thinking. George Gheverghese Joseph takes us on a breathtaking multicultural tour of the roots and shoots of non-European mathematics. He shows us the deep influence that the Egyptians and Babylonians had on the Greeks, the Arabs' major creative contributions, and the astounding range of successes of the great civilizations of India and China.' 
Amazon
  back

Lonergan, Bernard J F, and Robert M. Doran, Frederick E. Crowe (eds), Verbum : Word and Idea in Aquinas (Collected Works of Bernard Lonergan volume 2), University of Toronto Press 1997 Jacket: 'Verbum is a product of Lonergan's eleven years of study of the thought of Thomas Aquinas. The work is considered by many to be a breakthrough in the history of Lonergan's theology . . .. Here he interprets aspects in the writing of Aquinas relevant to trinitarian theory and, as in most of Lonergan's work, one of the principal aims is to assist the reader in the search to understand the workings of the human mind.' 
Amazon
  back

Lonergan, Bernard J F, Insight: A Study of Human Understanding (Collected Works of Bernard Lonergan : Volume 3), University of Toronto Press 1992 '. . . Bernard Lonergan's masterwork. Its aim is nothing less than insight into insight itself, an understanding of understanding' 
Amazon
  back

Lonergan (2007), Bernard J F, and Michael G Shields (translator), Robert M Doran & H Daniel Monsour (editors), The Triune God: Systematics, University of Toronto press 2007 De Deo trino, or The Triune God, is the third great instalment on one particular strand in trinitarian theology, namely, the tradition that appeals to a psychological analogy for understanding trinitarian processions and relations. The analogy dates back to St Augustine but was significantly developed by St Thomas Aquinas. Lonergan advances it to a new level of sophistication by rooting it in his own highly nuanced cognitional theory and in his early position on decision and love. . . . This is truly one of the great masterpieces in the history of systematic theology, perhaps even the greatest of all time.' 
Amazon
  back

Lonergan (Collected Works 11), Bernard J F, and Robert M Doran and H Daniel Monsour (eds), The Triune God: Doctrines (Volume 11 of Collected Works), University of Toronto Press 2009 Bernard Lonergan (1904-1984), a professor of theology, taught at Regis College, Harvard University, and Boston College. An established author known for his Insight and Method in Theology, Lonergan received numerous honorary doctorates, was made a Companion of the Order of Canada in 1971 and was named as an original member of the International Theological Commission by Pope Paul VI. 
Amazon
  back

Lonergan (Collected Works 12), Bernard J F, and Michael G Shields (translator), Robert M Doran & H Daniel Monsour (editors), The Triune God: Systematics, University of Toronto press 2007 Translated from De Deo Trino: Pars systematica (1964) by Michael G Shields. Amazon Product Description 'Buried for more than forty years in a Latin text written for seminarian students at the Gregorian University in Rome, Bernard Lonergan's 1964 masterpiece of systematic-theological writing, De Deo trino: Pars systematica, is only now being published in an edition that includes the original Latin along with an exact and literal translation. De Deo trino, or The Triune God, is the third great instalment on one particular strand in trinitarian theology, namely, the tradition that appeals to a psychological analogy for understanding trinitarian processions and relations. The analogy dates back to St Augustine but was significantly developed by St Thomas Aquinas. Lonergan advances it to a new level of sophistication by rooting it in his own highly nuanced cognitional theory and in his early position on decision and love. Suggestions for a further development of the analogy appear in Lonergan's late work, but these cannot be understood and implemented without working through this volume. This is truly one of the great masterpieces in the history of systematic theology, perhaps even the greatest of all time.' 
Amazon
  back

Misner, Charles W, and Kip S Thorne, John Archibald Wheeler, Gravitation, Freeman 1973 Jacket: 'Einstein's description of gravitation as curvature of spacetime led directly to that greatest of all predictions of his theory, that the universe itself is dynamic. Physics still has far to go to come to terms with this amazing fact and what it means for man and his relation to the universe. John Archibald Wheeler. . . . this is a book on Einstein's theory of gravity. . . . ' 
Amazon
  back

Neuenschwander, Dwight E, Emmy Noether's Wonderful Theorem, Johns Hopkins University Press 2011 Jacket: 'A beautiful piece of mathematics, Noether's theorem touches on every aspect of physics. Emmy Noether proved her theorem in 1915 and published it in 1918. This profound concept demonstrates the connection between conservation laws and symmetries. For instance, the theorem shows that a system invariant under translations of time, space or rotation will obey the laws of conservation of energy, linear momentum or angular momentum respectively. This exciting result offers a rich unifying principle for all of physics.' 
Amazon
  back

Nielsen, Michael A, and Isaac L Chuang, Quantum Computation and Quantum Information, Cambridge University Press 2000 Review: 'A rigorous, comprehensive text on quantum information is timely. The study of quantum information and computation represents a particularly direct route to understanding quantum mechanics. Unlike the traditional route to quantum mechanics via Schroedinger's equation and the hydrogen atom, the study of quantum information requires no calculus, merely a knowledge of complex numbers and matrix multiplication. In addition, quantum information processing gives direct access to the traditionally advanced topics of measurement of quantum systems and decoherence.' Seth Lloyd, Department of Quantum Mechanical Engineering, MIT, Nature 6876: vol 416 page 19, 7 March 2002. 
Amazon
  back

Noyes, H. Pierre, and J. C. van den Berg, Bit-String Physics: A Finite and Discrete Approach to Natural Philosophy, World Scientific 2001 'We could be on the threshold of a scientific revolution. Quantum mechanics is based on unique, finite, and discrete events. General relativity assumes a continuous, curved space-time. Reconciling the two remains the most fundamental unsolved scientific problem left over from the last century. The papers of H Pierre Noyes collected in this volume reflect one attempt to achieve that unification by replacing the continuum with the bit-string events of computer science. Three principles are used: physics can determine whether two quantities are the same or different; measurement can tell something from nothing; this structure (modeled by binary addition and multiplication) can leave a historical record consisting of a growing universe of bit-strings. This book is specifically addressed to those interested in the foundations of particle physics, relativity, quantum mechanics, physical cosmology and the philosophy of science.' 
Amazon
  back

Peacock, John A, Cosmological Physics, Cambridge University Press 1999 Nature Book Review: ' The intermingling of observational detail and fundamental theory has made cosmology an exceptionally rich, exciting and controversial science. Students in the field — whether observers or particle theorists — are expected to be acquainted with matters ranging from the Supernova Ia distance scale, Big Bang nucleosynthesis theory, scale-free quantum fluctuations during inflation, the galaxy two-point correlation function, particle theory candidates for the dark matter, and the star formation history of the Universe. Several general science books, conference proceedings and specialized monographs have addressed these issues. Peacock's Cosmological Physics ambitiously fills the void for introducing students with a strong undergraduate background in physics to the entire world of current physical cosmology. The majestic sweep of his discussion of this vast terrain is awesome, and is bound to capture the imagination of most students.' Ray Carlberg, Nature 399:322 
Amazon
  back

Streater, Raymond F, and Arthur S Wightman, PCT, Spin, Statistics and All That, Princeton University Press 2000 Amazon product description: 'PCT, Spin and Statistics, and All That is the classic summary of and introduction to the achievements of Axiomatic Quantum Field Theory. This theory gives precise mathematical responses to questions like: What is a quantized field? What are the physically indispensable attributes of a quantized field? Furthermore, Axiomatic Field Theory shows that a number of physically important predictions of quantum field theory are mathematical consequences of the axioms. Here Raymond Streater and Arthur Wightman treat only results that can be rigorously proved, and these are presented in an elegant style that makes them available to a broad range of physics and theoretical mathematics.' 
Amazon
  back

't Hooft, Gerardus, 50 Years of Yang Mills Theory, World Scientific 2005 'On the 50th anniversary of Yang-Mills theory, this invaluable volume looks back at the developments and achievements in elementary particle physics that ensued from that beautiful idea. During the last five decades, Yang-Mills theory, which is undeniably the most important cornerstone of theoretical physics, has expanded widely. It has been investigated from many perspectives, and many new and unexpected features have been uncovered from this theory. In recent decades, apart from high energy physics, the theory has been actively applied in other branches of physics, such as statistical physics, condensed matter physics, nonlinear systems, etc. This makes the theory an indispensable topic for all who are involved in physics. An international team of experts, each of whom has left his mark on the developments of this remarkable theory, contribute essays or more detailed technical accounts to this volume. These articles highlight the new discoveries from the respective authors' perspectives. The distinguished contributors are: S Adler, F A Bais, C Becchi, M Creutz, A De Rujula, B S DeWitt, F Englert, L D Faddeev, P Hasenfratz, R Jackiw, A Polyakov, V N Popov, R Stora, P van Baal, P van Nieuwenhuizen, S Weinberg, F Wilczek, E Witten, C N Yang. Included in each article are introductory and explanatory remarks by the editor, G 't Hooft, who is himself a major player in the development of Yang-Mills theory.' 
Amazon
  back

Veltman, Martinus, Diagrammatica: The Path to the Feynman Rules, Cambridge University Press 1994 Jacket: 'This book provides an easily accessible introduction to quantum field theory via Feynman rules and calculations in particle physics. The aim is to make clear what the physical foundations of present-day field theory are, to clarify the physical content of Feynman rules, and to outline their domain of applicability. ... The book includes valuable appendices that review some essential mathematics, including complex spaces, matrices, the CBH equation, traces and dimensional regularization. ...' 
Amazon
  back

von Neumann, John, and Robert T Beyer (translator), Mathematical Foundations of Quantum Mechanics, Princeton University Press 1983 Jacket: '. . . a revolutionary book that caused a sea change in theoretical physics. . . . JvN begins by presenting the theory of Hermitean operators and Hilbert spaces. These provide the framework for transformation theory, which JvN regards as the definitive form of quantum mechanics. . . . Regarded as a tour de force at the time of its publication, this book is still indispensable for those interested in the fundamental issues of quantum mechanics.' 
Amazon
  back

Wiener, Norbert, Cybernetics or Control and Communication in the Animal and the Machine, MIT Press 1996 The classic founding text of cybernetics. 
Amazon
  back

Wilczek, Frank, The Lightness of Being: Mass, Ether, and the Unification of Forces, Basic Books 2008 ' In this excursion to the outer limits of particle physics, Wilczek explores what quarks and gluons, which compose protons and neutrons, reveal about the manifestation of mass and gravity. A corecipient of the 2004 Nobel Prize in Physics, Wilczek knows what he’s writing about; the question is, will general science readers? Happily, they know what the strong interaction is (the forces that bind the nucleus), and in Wilczek, they have a jovial guide who adheres to trade publishing’s belief that a successful physics title will not include too many equations. Despite this injunction (against which he lightly protests), Wilczek delivers an approachable verbal picture of what quarks and gluons are doing inside a proton that gives rise to mass and, hence, gravity. Casting the light-speed lives of quarks against “the Grid,” Wilczek’s term for the vacuum that theoretically seethes with quantum activity, Wilczek exudes a contagious excitement for discovery. A near-obligatory acquisition for circulating physics collections.' --Gilbert Taylor  
Amazon
  back

Zee, Anthony, Quantum Field Theory in a Nutshell, Princeton University Press 2003 Amazon book description: 'An esteemed researcher and acclaimed popular author takes up the challenge of providing a clear, relatively brief, and fully up-to-date introduction to one of the most vital but notoriously difficult subjects in theoretical physics. A quantum field theory text for the twenty-first century, this book makes the essential tool of modern theoretical physics available to any student who has completed a course on quantum mechanics and is eager to go on. Quantum field theory was invented to deal simultaneously with special relativity and quantum mechanics, the two greatest discoveries of early twentieth-century physics, but it has become increasingly important to many areas of physics. These days, physicists turn to quantum field theory to describe a multitude of phenomena. Stressing critical ideas and insights, Zee uses numerous examples to lead students to a true conceptual understanding of quantum field theory--what it means and what it can do. He covers an unusually diverse range of topics, including various contemporary developments, while guiding readers through thoughtfully designed problems. In contrast to previous texts, Zee incorporates gravity from the outset and discusses the innovative use of quantum field theory in modern condensed matter theory. Without a solid understanding of quantum field theory, no student can claim to have mastered contemporary theoretical physics. Offering a remarkably accessible conceptual introduction, this text will be widely welcomed and used.'  
Amazon
  back

Links

Action (physics) - Wikipedia, Action (physics) - Wikipedia, the free encyclopedia, 'In physics, action is an attribute of the dynamics of a physical system from which the equations of motion of the system can be derived. It is a mathematical functional which takes the trajectory, also called path or history, of the system as its argument and has a real number as its result. Generally, the action takes different values for different paths. Action has the dimensions of [energy × time] or [momentum × length], and its SI unit is the joule-second.' back

Albert Einstein - Wikipedia, Albert Einstein - Wikipedia, the free encyclopedia, 'Albert Einstein (14 March 1879 – 18 April 1955) was an ethnically Jewish German-born theoretical physicist. He is best known for his theories of special relativity and general relativity. Einstein received the 1921 Nobel Prize in Physics "for his services to Theoretical Physics, and especially for his discovery of the law of the photoelectric effect."' back

Andrew David Irvine (Stanford Encyclopedia of Philosophy), Principia Mathematica, 'Principia Mathematica, the landmark work in formal logic written by Alfred North Whitehead and Bertrand Russell, was first published in three volumes in 1910, 1912 and 1913. . . . Written as a defense of logicism (the thesis that mathematics is in some significant sense reducible to logic), the book was instrumental in developing and popularizing modern mathematical logic. It also served as a major impetus for research in the foundations of mathematics throughout the twentieth century. Along with Aristotle's Organon and Gottlob Frege's Grundgesetze der Arithmetik, it remains one of the most influential books on logic ever written.' back

Andrew Hodges, Alan Turing: The Enigma, 'Founder of computer science, mathematician, philosopher, codebreaker, strange visionary and a gay man before his time.' back

Anthropic principle - Wikipedia, Anthropic principle - Wikipedia, the free encyclopedia, 'The anthropic principle (from Greek anthropos, meaning "human") is the philosophical consideration that observations of the universe must be compatible with the conscious and sapient life that observes it. Some proponents of the anthropic principle reason that it explains why the universe has the age and the fundamental physical constants necessary to accommodate conscious life. As a result, they believe it is unremarkable that the universe's fundamental constants happen to fall within the narrow range thought to be compatible with life.' back

Apophatic theology - Wikipedia, Apophatic theology - Wikipedia, the free encyclopedia, 'Apophatic theology (from Greek ἀπόφασις from ἀπόφημι - apophēmi, "to deny")—also known as negative theology or via negativa (Latin for "negative way")—is a theology that attempts to describe God, the Divine Good, by negation, to speak only in terms of what may not be said about the perfect goodness that is God. It stands in contrast with cataphatic theology.' back

Aquinas, Summa, I, 2, 3, Does God exist?, 'I answer that, The existence of God can be proved in five ways. The first and more manifest way is the argument from motion. . . . ' back

Aquinas, Summa, I, 15, 1, Are there ideas?, 'I answer that, It is necessary to suppose ideas in the divine mind. For the Greek word Idea is in Latin "forma." ' back

Aquinas, Summa, I-II, 3, 8, Does human happiness consist in the vision of the divine essence?, 'If therefore the human intellect, knowing the essence of some created effect, knows no more of God than "that He is"; the perfection of that intellect does not yet reach simply the First Cause, but there remains in it the natural desire to seek the cause. Wherefore it is not yet perfectly happy. Consequently, for perfect happiness the intellect needs to reach the very Essence of the First Cause.' back

Aquinas, Summa, I, 22, 3, Does God have immediate providence over everything?, 'I answer that, Two things belong to providence -- namely, the type of the order of things foreordained towards an end; and the execution of this order, which is called government. As regards the first of these, God has immediate providence over everything, because He has in His intellect the types of everything, even the smallest; and whatsoever causes He assigns to certain effects, He gives them the power to produce those effects.' back

Aquinas, Summa, I, 27, 1, Is there procession in God?, 'As God is above all things, we should understand what is said of God, not according to the mode of the lowest creatures, namely bodies, but from the similitude of the highest creatures, the intellectual substances; while even the similitudes derived from these fall short in the representation of divine objects. Procession, therefore, is not to be understood from what it is in bodies, either according to local movement or by way of a cause proceeding forth to its exterior effect, as, for instance, like heat from the agent to the thing made hot. Rather it is to be understood by way of an intelligible emanation, for example, of the intelligible word which proceeds from the speaker, yet remains in him. In that sense the Catholic Faith understands procession as existing in God.' back

Aquinas, Summa, I, 3, 4, Are essence and existence the same in God?, 'I answer that, God is not only His own essence, as shown in the preceding article, but also His own existence. This may be shown in several ways. First, whatever a thing has besides its essence must be caused either by the constituent principles of that essence (like a property that necessarily accompanies the species--as the faculty of laughing is proper to a man--and is caused by the constituent principles of the species), or by some exterior agent--as heat is caused in water by fire. Therefore, if the existence of a thing differs from its essence, this existence must be caused either by some exterior agent or by its essential principles. Now it is impossible for a thing's existence to be caused by its essential constituent principles, for nothing can be the sufficient cause of its own existence, if its existence is caused. Therefore that thing, whose existence differs from its essence, must have its existence caused by another. But this cannot be true of God; because we call God the first efficient cause. Therefore it is impossible that in God His existence should differ from His essence.' back

Aquinas, Summa, I, 3, 7, Is God altogether simple?, 'I answer that, The absolute simplicity of God may be shown in many ways. First, from the previous articles of this question. For there is neither composition of quantitative parts in God, since He is not a body; nor composition of matter and form; nor does His nature differ from His "suppositum"; nor His essence from His existence; neither is there in Him composition of genus and difference, nor of subject and accident. Therefore, it is clear that God is nowise composite, but is altogether simple. . . . ' back

Aquinas, Summa, I, 34, 2, Is "Word" the Son's proper name?, ' I answer that, "Word," said of God in its proper sense, is used personally, and is the proper name of the person of the Son. For it signifies an emanation of the intellect: and the person Who proceeds in God, by way of emanation of the intellect, is called the Son; and this procession is called generation. Hence it follows that the Son alone is properly called Word in God.' back

Aquinas, Summa, I, 37, 1, Is "Love" the proper name of the Holy Spirit?, ' It follows that so far as love means only the relation of the lover to the object loved, "love" and "to love" are said of the essence, as "understanding" and "to understand"; but, on the other hand, so far as these words are used to express the relation to its principle, of what proceeds by way of love, and "vice versa," so that by "love" is understood the "love proceeding," and by "to love" is understood "the spiration of the love proceeding," in that sense "love" is the name of the person and "to love" is a notional term, as "to speak" and "to beget." ' back

Aquinas, Summa, I, 8, 1, Is God in all things?, ' I answer that, God is in all things; not, indeed, as part of their essence, nor as an accident, but as an agent is present to that upon which it works. For an agent must be joined to that wherein it acts immediately and touch it by its power; hence it is proved in Phys. vii that the thing moved and the mover must be joined together. Now since God is very being by His own essence, created being must be His proper effect; . . . Now God causes this effect in things not only when they first begin to be, but as long as they are preserved in being; . . . Therefore as long as a thing has being, God must be present to it, according to its mode of being.' back

Aquinas, Summa, I, 3, 1, Is God a body?, 'I answer that, It is absolutely true that God is not a body; and this can be shown in three ways. . . . Secondly, because the first being must of necessity be in act, and in no way in potentiality. . . . Now it has been already proved that God is the First Being. It is therefore impossible that in God there should be any potentiality. But every body is in potentiality because the continuous, as such, is divisible to infinity; it is therefore impossible that God should be a body.' back

Aquinas, Summa, I, 14, 1, Is there knowledge in God?, ' I answer that, In God there exists the most perfect knowledge. . . . Therefore it is clear that the immateriality of a thing is the reason why it is cognitive; . . . Since therefore God is in the highest degree of immateriality as stated above it follows that He occupies the highest place in knowledge.' back

Aquinas, Summa, I, 25, 3, Is God omnipotent?, ' . . . God is called omnipotent because He can do all things that are possible absolutely; . . . For a thing is said to be possible or impossible absolutely, according to the relation in which the very terms stand to one another, possible if the predicate is not incompatible with the subject, as that Socrates sits; and absolutely impossible when the predicate is altogether incompatible with the subject, as, for instance, that a man is a donkey.' back

Aquinas, Summa, I, 12, 12, Can God be known in this life by natural reason?, 'I answer that, Our natural knowledge begins from sense. Hence our natural knowledge can go as far as it can be led by sensible things. But our mind cannot be led by sense so far as to see the essence of God; because the sensible effects of God do not equal the power of God as their cause. Hence from the knowledge of sensible things the whole power of God cannot be known; nor therefore can His essence be seen. But because they are His effects and depend on their cause, we can be led from them so far as to know of God whether He exists, and to know of Him what must necessarily belong to Him, as the first cause of all things, exceeding all things caused by Him.' back

Archaea - Wikipedia, Archaea - Wikipedia, the free encyclopedia, 'The Archaea . . . constitute a domain and kingdom of single-celled microorganisms. These microbes . . . are prokaryotes, meaning they have no cell nucleus or any other membrane-bound organelles in their cells. Archaea were initially classified as bacteria, receiving the name archaebacteria (in the Archaebacteria kingdom), but this classification is outdated. Archaeal cells have unique properties separating them from the other two domains of life, Bacteria and Eukaryota.' back

Archaea - Wikipedia, Archaea - Wikipedia, the free encyclopedia, ' Archaea (singular archaeon) constitute a domain of single-celled organisms. These microorganisms lack cell nuclei and are therefore prokaryotes. Archaea were initially classified as bacteria, receiving the name archaebacteria (in the Archaebacteria kingdom), but this classification is obsolete. Archaeal cells have unique properties separating them from the other two domains, Bacteria and Eukaryota. Archaea are further divided into multiple recognized phyla. Classification is difficult because most have not been isolated in the laboratory and have been detected only by analysis of their nucleic acids in samples from their environment.' back

Archaeoastronomy - Wikipedia, Archaeoastronomy - Wikipedia, the free encyclopedia, 'Archaeoastronomy . . . is the study of how people in the past "have understood the phenomena in the sky, how they used these phenomena and what role the sky played in their cultures."' back

Aristotle - Metaphysics, Internet Classics Archive | Metaphysics by Aristotle, 'ALL men by nature desire to know. An indication of this is the delight we take in our senses; for even apart from their usefulness they are loved for themselves; and above all others the sense of sight. For not only with a view to action, but even when we are not going to do anything, we prefer seeing (one might say) to everything else. The reason is that this, most of all the senses, makes us know and brings to light many differences between things. ' [980a21 sqq] back

Aristotle Metaphysics, Metaphysics XII, vii, 9: 1072 b 25sqq, 'If, then, the happiness which God always enjoys is as great as that which we enjoy sometimes, it is marvellous; and if it is greater, this is still more marvellous. Nevertheless it is so. Moreover, life belongs to God. For the actuality of thought is life, and God is that actuality; and the essential actuality of God is life most good and eternal. We hold, then, that God is a living being, eternal, most good; and therefore life and a continuous eternal existence belong to God; for that is what God is.' back

Aristotle: time, Physics, VIII, 1 (251b12), 'Further, how can there be any 'before' and 'after' without the existence of time? Or how can there be any time without the existence of motion? If, then, time is the number of motion or itself a kind of motion, it follows that, if there is always time, motion must also be eternal.' back

Astrology - Wikipedia, Astrology - Wikipedia, the free encyclopedia, 'Astrology is the study of the movements and relative positions of celestial objects as a means for divining information about human affairs and terrestrial events. Astrology has been dated to at least the 2nd millennium BCE, and has its roots in calendrical systems used to predict seasonal shifts and to interpret celestial cycles as signs of divine communications.' back

Atom - Wikipedia, Atom - Wikipedia, the free encyclopedia, 'The atom is a basic unit of matter that consists of a dense central nucleus surrounded by a cloud of negatively charged electrons. The atomic nucleus contains a mix of positively charged protons and electrically neutral neutrons (except in the case of hydrogen-1, which is the only stable nuclide with no neutrons). The electrons of an atom are bound to the nucleus by the electromagnetic force.' back

Atomic electron transition - Wikipedia, Atomic electron transition - Wikipedia, the free encyclopedia, 'Atomic electron transition is a change of an electron from one quantum state to another within an atom or artificial atom. It appears discontinuous as the electron "jumps" from one energy level to another in a few nanoseconds or less. It is also known as atomic transition, quantum jump, or quantum leap. Electron transitions cause the emission or absorption of electromagnetic radiation in the form of quantized units called photons.' back

Atomic nucleus - Wikipedia, Atomic nucleus - Wikipedia, the free encyclopedia, 'The nucleus of an atom is the very small dense region of an atom, in its centre consisting of nucleons (protons and neutrons). The size (diameter) of the nucleus is in the range of 1.6 fm (10⁻¹⁵ m) (for a proton in light hydrogen) to about 15 fm (for the heaviest atoms, such as uranium). These dimensions are much smaller than the size of the atom itself by a factor of about 23,000 (uranium) to about 145,000 (hydrogen). Almost all of the mass in an atom is made up from the protons and neutrons in the nucleus with a very small contribution from the orbiting electrons.' back

Atomic radius - Wikipedia, Atomic radius - Wikipedia, the free encyclopedia, ' The atomic radius of a chemical element is a measure of the size of its atoms, usually the mean or typical distance from the center of the nucleus to the boundary of the surrounding shells of electrons. Since the boundary is not a well-defined physical entity, there are various non-equivalent definitions of atomic radius.' back

Big Bang - Wikipedia, Big Bang - Wikipedia, the free encyclopedia, 'The Big Bang theory is the prevailing cosmological model that explains the early development of the Universe. According to the Big Bang theory, the Universe was once in an extremely hot and dense state which expanded rapidly. This rapid expansion caused the young Universe to cool and resulted in its present continuously expanding state. According to the most recent measurements and observations, this original state existed approximately 13.7 billion years ago, which is considered the age of the Universe and the time the Big Bang occurred. ' back

Black hole - Wikipedia, Black hole - Wikipedia, the free encyclopedia, ' Objects whose gravitational fields are too strong for light to escape were first considered in the 18th century by John Michell and Pierre-Simon Laplace. The first modern solution of general relativity that would characterize a black hole was found by Karl Schwarzschild in 1916, although its interpretation as a region of space from which nothing can escape was first published by David Finkelstein in 1958. Black holes were long considered a mathematical curiosity; it was during the 1960s that theoretical work showed they were a generic prediction of general relativity.' back

Bohr model - Wikipedia, Bohr model - Wikipedia, the free encyclopedia, 'In atomic physics, the Rutherford–Bohr model or Bohr model, introduced by Niels Bohr in 1913, depicts the atom as a small, positively charged nucleus surrounded by electrons that travel in circular orbits around the nucleus—similar in structure to the solar system, but with attraction provided by electrostatic forces rather than gravity' back

Boltzmann's entropy formula - Wikipedia, Boltzmann's entropy formula - Wikipedia, the free encyclopedia, 'In statistical mechanics, Boltzmann's equation is a probability equation relating the entropy S of an ideal gas to the quantity W, which is the number of microstates corresponding to a given macrostate:
S = k ln W
where k is the Boltzmann constant, . . . which is equal to 1.38062 × 10⁻²³ J/K.' back
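The formula quoted in this entry can be checked with a few lines of code; the sketch below is illustrative only (the helper name is ours, and k is taken at its modern CODATA value):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (CODATA value)

def boltzmann_entropy(W: int) -> float:
    """Entropy S = k * ln(W) of a macrostate with W equiprobable microstates."""
    return K_B * math.log(W)

# A macrostate with a single microstate carries zero entropy, and because
# ln(W1 * W2) = ln(W1) + ln(W2), entropy is additive over independent systems.
assert boltzmann_entropy(1) == 0.0
assert abs(boltzmann_entropy(4) - 2 * boltzmann_entropy(2)) < 1e-28
```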

Born rule - Wikipedia, Born rule - Wikipedia, the free encyclopedia, 'The Born rule (also called the Born law, Born's rule, or Born's law) is a law of quantum mechanics which gives the probability that a measurement on a quantum system will yield a given result. It is named after its originator, the physicist Max Born. The Born rule is one of the key principles of the Copenhagen interpretation of quantum mechanics. There have been many attempts to derive the Born rule from the other assumptions of quantum mechanics, with inconclusive results. . . . The Born rule states that if an observable corresponding to a Hermitian operator A with discrete spectrum is measured in a system with normalized wave function (see bra-ket notation), then the measured result will be one of the eigenvalues λ of A, and the probability of measuring a given eigenvalue λᵢ will equal ⟨ψ|Pᵢ|ψ⟩, where Pᵢ is the projection onto the eigenspace of A corresponding to λᵢ.' back

Boson - Wikipedia, Boson - Wikipedia, the free encyclopedia, 'In particle physics, bosons are particles with an integer spin, as opposed to fermions which have half-integer spin. From a behaviour point of view, fermions are particles that obey the Fermi-Dirac statistics while bosons are particles that obey the Bose-Einstein statistics. They may be either elementary, like the photon, or composite, as mesons. All force carrier particles are bosons. They are named after Satyendra Nath Bose. In contrast to fermions, several bosons can occupy the same quantum state. Thus, bosons with the same energy can occupy the same place in space.' back

Brouwer fixed point theorem - Wikipedia, Brouwer fixed point theorem - Wikipedia, the free encyclopedia, ' Brouwer's fixed-point theorem is a fixed-point theorem in topology, named after Luitzen Brouwer. It states that for any continuous function f with certain properties there is a point x₀ such that f(x₀) = x₀. The simplest form of Brouwer's theorem is for continuous functions f from a disk D to itself. A more general form is for continuous functions from a convex compact subset K of Euclidean space to itself.' back

Calculus of variations - Wikipedia, Calculus of variations - Wikipedia, the free encyclopedia, 'Calculus of variations is a field of mathematical analysis that deals with maximizing or minimizing functionals, which are mappings from a set of functions to the real numbers. Functionals are often expressed as definite integrals involving functions and their derivatives. The interest is in extremal functions that make the functional attain a maximum or minimum value – or stationary functions – those where the rate of change of the functional is zero.' back

Celestial coordinate system - Wikipedia, Celestial coordinate system - Wikipedia, the free encyclopedia, 'In astronomy, a celestial coordinate system is a system for specifying positions of celestial objects: satellites, planets, stars, galaxies, and so on. Coordinate systems can specify a position in 3-dimensional space, or merely the direction of the object on the celestial sphere, if its distance is not known or not important.' back

Cell division - Wikipedia, Cell division - Wikipedia, the free encyclopedia, 'Cell division is the process by which a parent cell divides into two or more daughter cells. Cell division usually occurs as part of a larger cell cycle. . . . The primary concern of cell division is the maintenance of the original cell's genome. Before division can occur, the genomic information that is stored in chromosomes must be replicated, and the duplicated genome must be separated cleanly between cells. A great deal of cellular infrastructure is involved in keeping genomic information consistent between generations.' back

Christopher Shields (Stanford Encyclopedia of Philosophy), The Active Mind of De Anima III 5, ' After characterizing the mind (nous) and its activities in De Anima iii 4, Aristotle takes a surprising turn. In De Anima iii 5, he introduces an obscure and hotly disputed subject: the active mind or active intellect (nous poiêtikos). Controversy surrounds almost every aspect of De Anima iii 5, not least because in it Aristotle characterizes the active mind—a topic mentioned nowhere else in his entire corpus—as ‘separate and unaffected and unmixed, being in its essence actuality’ (chôristos kai apathês kai amigês, tê ousia energeia; DA iii 5, 430a17–18) and then also as ‘deathless and everlasting’ (athanaton kai aidion; DA iii 5, 430a23). This comes as no small surprise to readers of De Anima, because Aristotle had earlier in the same work treated the mind (nous) as but one faculty (dunamis) of the soul (psuchê), and he had contended that the soul as a whole is not separable from the body (DA ii 1, 413a3–5).' back

Christopher Shields (Stanford Encyclopedia of Philosophy), Aristotle , First published Thu Sep 25, 2008 'Aristotle (384–322 B.C.E.) numbers among the greatest philosophers of all time. Judged solely in terms of his philosophical influence, only Plato is his peer: . . . A prodigious researcher and writer, Aristotle left a great body of work, perhaps numbering as many as two-hundred treatises, from which approximately thirty-one survive. His extant writings span a wide range of disciplines, from logic, metaphysics and philosophy of mind, through ethics, political theory, aesthetics and rhetoric, and into such primarily non-philosophical fields as empirical biology, where he excelled at detailed plant and animal observation and taxonomy. In all these areas, Aristotle's theories have provided illumination, met with resistance, sparked debate, and generally stimulated the sustained interest of an abiding readership.' back

Christopher Shields (Stanford Encyclopedia of Philosophy), The Active Mind of De Anima III 5, 'After characterizing the mind (nous) and its activities in De Anima iii 4, Aristotle takes a surprising turn. In De Anima iii 5, he introduces an obscure and hotly disputed subject: the active mind or active intellect (nous poiêtikos). Controversy surrounds almost every aspect of De Anima iii 5, not least because in it Aristotle characterizes the active mind—a topic mentioned nowhere else in his entire corpus—as ‘separate and unaffected and unmixed, being in its essence actuality’ (chôristos kai apathês kai amigês, tê(i) ousia(i) energeia; DA iii 5, 430a17–18) and then also as ‘deathless and everlasting’ (athanaton kai aidion; DA iii 5, 430a23). This comes as no small surprise to readers of De Anima, because Aristotle had earlier in the same work treated the mind (nous) as but one faculty (dunamis) of the soul (psuchê), and he had contended that the soul as a whole is not separable from the body (DA ii 1, 413a3–5).' back

Circle group - Wikipedia, Circle group - Wikipedia, the free encyclopedia, ' In mathematics, the circle group, denoted by T, is the multiplicative group of all complex numbers with absolute value 1, i.e., the unit circle in the complex plane or simply the unit complex numbers.' back

Citric acid cycle - Wikipedia, Citric acid cycle - Wikipedia, the free encyclopedia, ' The citric acid cycle (CAC) – also known as the TCA cycle (tricarboxylic acid cycle) or the Krebs cycle – is a series of chemical reactions used by all aerobic organisms to release stored energy through the oxidation of acetyl-CoA derived from carbohydrates, fats, and proteins. In addition, the cycle provides precursors of certain amino acids, as well as the reducing agent NADH, that are used in numerous other reactions. Its central importance to many biochemical pathways suggests that it was one of the earliest components of metabolism and may have originated abiogenically.' back

Codec - Wikipedia, Codec - Wikipedia, the free encyclopedia, 'A codec is a device or computer program capable of encoding or decoding a digital data stream or signal. Codec is a portmanteau of coder-decoder or, less commonly, compressor-decompressor.' back

Conjecture - Wikipedia, Conjecture - Wikipedia, the free encyclopedia, 'In mathematics, a conjecture is a conclusion or proposition based on incomplete information, for which no proof has been found. Conjectures such as the Riemann hypothesis (still a conjecture) or Fermat's Last Theorem (which was a conjecture until proven in 1995) have shaped much of mathematical history as new areas of mathematics are developed in order to prove them.' back

Conservation of energy - Wikipedia, Conservation of energy - Wikipedia, the free encyclopedia, 'In physics, the law of conservation of energy states that the total energy of an isolated system cannot change—it is said to be conserved over time. Energy can be neither created nor destroyed, but can change form, for instance chemical energy can be converted to kinetic energy in the explosion of a stick of dynamite.' back

Copenhagen interpretation - Wikipedia, Copenhagen interpretation - Wikipedia, the free encyclopedia, ' According to the Copenhagen interpretation, physical systems generally do not have definite properties prior to being measured, and quantum mechanics can only predict the probability distribution of a given measurement's possible results. The act of measurement affects the system, causing the set of probabilities to reduce to only one of the possible values immediately after the measurement. This feature is known as wave function collapse.' back

Cosmic microwave background - Wikipedia, Cosmic microwave background - Wikipedia, the free encyclopedia, 'The cosmic microwave background (CMB) is the thermal radiation left over from the time of recombination in Big Bang cosmology. . . . The CMB is a snapshot of the oldest light in our Universe, imprinted on the sky when the Universe was just 380,000 years old. It shows tiny temperature fluctuations that correspond to regions of slightly different densities, representing the seeds of all future structure: the stars and galaxies of today.' back

Cybernetics - Wikipedia, Cybernetics - Wikipedia, the free encyclopedia, ' Cybernetics is a transdisciplinary approach for exploring regulatory systems, their structures, constraints, and possibilities. Cybernetics is relevant to the study of systems, such as mechanical, physical, biological, cognitive, and social systems. Cybernetics is applicable when a system being analyzed is involved in a closed signaling loop; that is, where action by the system generates some change in its environment and that change is reflected in that system in some manner (feedback) that triggers a system change, originally referred to as a "circular causal" relationship.' back

David J. Gross, Nobel lecture: The Discovery of Asymptotic Freedom and the Emergence of QCD, ' The emergence of QCD is a wonderful example of the evolution from farce to triumph. During a very short period, a transition occurred from experimental discovery and theoretical confusion to theoretical triumph and experimental confirmation. In this Nobel lecture I shall describe the turn of events that led to the discovery of asymptotic freedom, which in turn led to the formulation of QCD, the final element of the remarkably comprehensive theory of elementary particle physics – the Standard Model. I shall then briefly describe the experimental tests of the theory and the implications of asymptotic freedom.' back

Differentiable manifold - Wikipedia, Differentiable manifold - Wikipedia, the free encyclopedia, ' In mathematics, a differentiable manifold is a type of manifold that is locally similar enough to a linear space to allow one to do calculus. Any manifold can be described by a collection of charts, also known as an atlas. One may then apply ideas from calculus while working within the individual charts, since each chart lies within a linear space to which the usual rules of calculus apply. If the charts are suitably compatible (namely, the transition from one chart to another is differentiable), then computations done in one chart are valid in any other differentiable chart.' back

Dimensional analysis - Wikipedia, Dimensional analysis - Wikipedia, the free encyclopedia, 'In engineering and science, dimensional analysis is the analysis of the relationships between different physical quantities by identifying their fundamental dimensions (such as length, mass, time, and electric charge) . . . Any physically meaningful equation (and any inequality and inequation) must have the same dimensions on the left and right sides. Checking this is a common application of performing dimensional analysis. Dimensional analysis is also routinely used as a check on the plausibility of derived equations and computations. It is generally used to categorize types of physical quantities and units based on their relationship to or dependence on other units.' back

Dirac equation - Wikipedia, Dirac equation - Wikipedia, the free encyclopedia, 'In particle physics, the Dirac equation is a relativistic wave equation derived by British physicist Paul Dirac in 1928. In its free form, or including electromagnetic interactions, it describes all spin-1⁄2 massive particles such as electrons and quarks, for which parity is a symmetry, and is consistent with both the principles of quantum mechanics and the theory of special relativity, and was the first theory to account fully for special relativity in the context of quantum mechanics. It accounted for the fine details of the hydrogen spectrum in a completely rigorous way.' back

Earliest known life forms - Wikipedia, Earliest known life forms - Wikipedia, the free encyclopedia, ' The earliest known life forms on Earth are putative fossilized microorganisms found in hydrothermal vent precipitates. The earliest time that life forms first appeared on Earth is at least 3.77 billion years ago, possibly as early as 4.28 billion years, or even 4.5 billion years; not long after the oceans formed 4.41 billion years ago, and after the formation of the Earth 4.54 billion years ago. The earliest direct evidence of life on Earth are microfossils of microorganisms permineralized in 3.465-billion-year-old Australian Apex chert rocks. ' back

Eigenvalues and eigenvectors - Wikipedia, Eigenvalues and eigenvectors - Wikipedia, the free encyclopedia, 'An eigenvector of a square matrix A is a non-zero vector v that, when multiplied by the matrix, yields a constant multiple of v, the latter multiplier being commonly denoted by λ. That is: Av = λv' back
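The relation Av = λv quoted in this entry can be verified with a self-contained 2×2 example; the function below is a sketch for symmetric matrices only (its name and the sample matrix are ours, not from the source):

```python
import math

# For a symmetric 2x2 matrix [[a, b], [b, d]] the eigenvalues are the roots
# of the characteristic polynomial λ² − (a + d)λ + (ad − b²) = 0.
def eig2x2_sym(a: float, b: float, d: float):
    tr, det = a + d, a * d - b * b
    disc = math.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

lam1, lam2 = eig2x2_sym(2.0, 1.0, 2.0)   # matrix A = [[2, 1], [1, 2]]

# Check Av = λv for the eigenvector v = (1, 1) belonging to λ = 3:
v = (1.0, 1.0)
Av = (2.0 * v[0] + 1.0 * v[1], 1.0 * v[0] + 2.0 * v[1])
assert Av == (lam1 * v[0], lam1 * v[1])
```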

Einstein field equations - Wikipedia, Einstein field equations - Wikipedia, the free encyclopedia, 'The Einstein field equations (EFE) or Einstein's equations are a set of ten equations in Albert Einstein's general theory of relativity which describe the fundamental interaction of gravitation as a result of spacetime being curved by matter and energy. First published by Einstein in 1915 as a tensor equation, the EFE equate spacetime curvature (expressed by the Einstein tensor) with the energy and momentum within that spacetime (expressed by the stress-energy tensor).' back

Electricity - Wikipedia, Electricity - Wikipedia, the free encyclopedia, 'Electricity is the set of physical phenomena associated with the presence and flow of electric charge. Electricity gives a wide variety of well-known effects, such as lightning, static electricity, electromagnetic induction and electric current. In addition, electricity permits the creation and reception of electromagnetic radiation such as radio waves.' back

Electromagnetism - Wikipedia, Electromagnetism - Wikipedia, the free encyclopedia, 'Electromagnetism is a branch of physics involving the study of the electromagnetic force, a type of physical interaction that occurs between electrically charged particles. The electromagnetic force usually exhibits electromagnetic fields such as electric fields, magnetic fields, and light and is one of the four fundamental interactions (commonly called forces) in nature. ' back

Elementary particles - Wikipedia, Elementary particles - Wikipedia, the free encyclopedia, 'In particle physics, an elementary particle or fundamental particle is a particle whose substructure is unknown; thus, it is unknown whether it is composed of other particles.' back

Entropy - Wikipedia, Entropy - Wikipedia, the free encyclopedia, 'In statistical mechanics, entropy is an extensive property of a thermodynamic system. It is closely related to the number Ω of microscopic configurations (known as microstates) that are consistent with the macroscopic quantities that characterize the system (such as its volume, pressure and temperature). Under the assumption that each microstate is equally probable, the entropy S is the natural logarithm of the number of microstates, multiplied by the Boltzmann constant kB. Formally (assuming equiprobable microstates), S = k B ln ⁡ Ω . ' back
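The formula quoted above, S = kB ln Ω, can be evaluated directly; a minimal Python sketch (the microstate count Ω is an arbitrary illustrative value):

```python
import math

# Boltzmann entropy S = k_B ln(Omega) for a system of Omega equiprobable microstates.
k_B = 1.380649e-23        # Boltzmann constant, J/K (exact by the 2019 SI definition)
Omega = 10**20            # illustrative number of microstates

S = k_B * math.log(Omega)
print(S)                  # ~6.358e-22 J/K
```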

Entropy (information theory) - Wikipedia, Entropy (information theory) - Wikipedia, the free encyclopedia, 'In information theory, entropy is a measure of the uncertainty associated with a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message, usually in units such as bits. In this context, a 'message' means a specific realization of the random variable. Equivalently, the Shannon entropy is a measure of the average information content one is missing when one does not know the value of the random variable. The concept was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".' back
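Shannon's measure of the expected information in a message can be computed in a few lines; a minimal Python sketch (the coin probabilities are illustrative):

```python
import math

# Shannon entropy H = -sum(p_i * log2(p_i)), measured in bits.
def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin is maximally uncertain
print(shannon_entropy([0.9, 0.1]))   # ~0.469 bits: a biased coin carries less information
```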

Euclid's Elements - Wikipedia, Euclid's Elements - Wikipedia, the free encyclopedia, 'Euclid's Elements (Greek: Στοιχεῖα Stoicheia) is a mathematical and geometric treatise consisting of 13 books written by the Greek mathematician Euclid in Alexandria c. 300 BC. It is a collection of definitions, postulates (axioms), propositions (theorems and constructions), and mathematical proofs of the propositions. The thirteen books cover Euclidean geometry and the ancient Greek version of elementary number theory. The work also includes an algebraic system that has become known as geometric algebra, which is powerful enough to solve many algebraic problems, including the problem of finding the square root.' back

Eugene Wigner, The Unreasonable Effectiveness of Mathematics in the Natural Sciences, 'The first point is that the enormous usefulness of mathematics in the natural sciences is something bordering on the mysterious and that there is no rational explanation for it. Second, it is just this uncanny usefulness of mathematical concepts that raises the question of the uniqueness of our physical theories.' back

Fermion - Wikipedia, Fermion - Wikipedia, the free encyclopedia, 'In particle physics, fermions are particles with a half-integer spin, such as protons and electrons. They obey the Fermi-Dirac statistics and are named after Enrico Fermi. In the Standard Model there are two types of elementary fermions: quarks and leptons. . . . In contrast to bosons, only one fermion can occupy a quantum state at a given time (they obey the Pauli Exclusion Principle). Thus, if more than one fermion occupies the same place in space, the properties of each fermion (e.g. its spin) must be different from the rest. Therefore fermions are usually related with matter while bosons are related with radiation, though the separation between the two is not clear in quantum physics.' back

Feynman Lectures on Physics, Vol. I, chapter 22, Algebra, 'In our study of oscillating systems we shall have occasion to use one of the most remarkable, almost astounding, formulas in all of mathematics. From the physicist’s point of view we could bring forth this formula in two minutes or so, and be done with it. But science is as much for intellectual enjoyment as for practical utility, so instead of just spending a few minutes on this amazing jewel, we shall surround the jewel by its proper setting in the grand design of that branch of mathematics which is called elementary algebra.' back

Feynman, Leighton & Sands FLP III:01, Chapter 1: Quantum Behaviour, 'The gradual accumulation of information about atomic and small-scale behavior during the first quarter of the 20th century, which gave some indications about how small things do behave, produced an increasing confusion which was finally resolved in 1926 and 1927 by Schrödinger, Heisenberg, and Born. They finally obtained a consistent description of the behavior of matter on a small scale. We take up the main features of that description in this chapter.' back

Feynman, Leighton & Sands FLP III:04, Chapter 4: Identical Particles, 'In the last chapter we began to consider the special rules for the interference that occurs in processes with two identical particles. By identical particles we mean things like electrons which can in no way be distinguished one from another. If a process involves two particles that are identical, reversing which one arrives at a counter is an alternative which cannot be distinguished and—like all cases of alternatives which cannot be distinguished—interferes with the original, unexchanged case. The amplitude for an event is then the sum of the two interfering amplitudes; but, interestingly enough, the interference is in some cases with the same phase and, in others, with the opposite phase.' back

Feynman, Leighton & Sands FLP III:08, Chapter 8: The Hamiltonian Matrix, 'One problem then in describing nature is to find a suitable representation for the base states. But that’s only the beginning. We still want to be able to say what “happens.” If we know the “condition” of the world at one moment, we would like to know the condition at a later moment. So we also have to find the laws that determine how things change with time. We now address ourselves to this second part of the framework of quantum mechanics—how states change with time.' back

Feynman, Leighton & Sands FLP III:09, Chapter 9: The Ammonia Maser, 'In this chapter we are going to discuss the application of quantum mechanics to a practical device, the ammonia maser. You may wonder why we stop our formal development of quantum mechanics to do a special problem, but you will find that many of the features of this special problem are quite common in the general theory of quantum mechanics, and you will learn a great deal by considering this one problem in detail. The ammonia maser is a device for generating electromagnetic waves, whose operation is based on the properties of the ammonia molecule which we discussed briefly in the last chapter.' back

First Point of Aries - Wikipedia, First Point of Aries - Wikipedia, the free encyclopedia, ' The First Point of Aries, also known as the Cusp of Aries, is the location of the vernal equinox, used as a reference point in celestial coordinate systems. In diagrams using such coordinate systems, it is often indicated with the symbol ♈︎. Named for the constellation of Aries, it is one of the two points on the celestial sphere at which the celestial equator crosses the ecliptic, the other being the First Point of Libra, located exactly 180° from it.' back

Galaxy formation and evolution - Wikipedia, Galaxy formation and evolution - Wikipedia, the free encyclopedia, 'The study of galaxy formation and evolution is concerned with the processes that formed a heterogeneous universe from a homogeneous beginning, the formation of the first galaxies, the way galaxies change over time, and the processes that have generated the variety of structures observed in nearby galaxies.' back

Galileo Galilei - Wikipedia, Galileo Galilei - Wikipedia, the free encyclopedia, 'Galileo Galilei (. . . 15 February 1564 – 8 January 1642), commonly known as Galileo, was an Italian physicist, mathematician, astronomer, and philosopher who played a major role in the Scientific Revolution. His achievements include improvements to the telescope and consequent astronomical observations and support for Copernicanism. Galileo has been called the "father of modern observational astronomy", the "father of modern physics", the "father of science", and "the Father of Modern Science".' back

General relativity - Wikipedia, General relativity - Wikipedia, the free encyclopedia, 'General relativity or the general theory of relativity is the geometric theory of gravitation published by Albert Einstein in 1916. It is the current description of gravitation in modern physics. General relativity generalises special relativity and Newton's law of universal gravitation, providing a unified description of gravity as a geometric property of space and time, or spacetime. In particular, the curvature of spacetime is directly related to the four-momentum (mass-energy and linear momentum) of whatever matter and radiation are present. The relation is specified by the Einstein field equations, a system of partial differential equations.' back

Genesis, Genesis, from the Holy Bible, King James Version, '1: In the beginning God created the heaven and the earth. 2: And the earth was without form, and void; and darkness was upon the face of the deep. And the Spirit of God moved upon the face of the waters. 3: And God said, Let there be light: and there was light.' back

Geodesic - Wikipedia, Geodesic - Wikipedia, the free encyclopedia, 'In mathematics, particularly differential geometry, a geodesic . . . is a generalization of the notion of a "straight line" to "curved spaces". In the presence of a Riemannian metric, geodesics are defined to be (locally) the shortest path between points in the space. In the presence of an affine connection, geodesics are defined to be curves whose tangent vectors remain parallel if they are transported along it. The term "geodesic" comes from geodesy, the science of measuring the size and shape of Earth; in the original sense, a geodesic was the shortest route between two points on the Earth's surface, namely, a segment of a great circle. . . . Geodesics are of particular importance in general relativity, as they describe the motion of inertial test particles.' back

Gravitation - Wikipedia, Gravitation - Wikipedia, the free encyclopedia, 'Gravitation, or gravity, is a natural phenomenon by which physical bodies attract with a force proportional to their mass. Gravitation is most familiar as the agent that gives weight to objects with mass and causes them to fall to the ground when dropped. Gravitation causes dispersed matter to coalesce, and coalesced matter to remain intact, thus accounting for the existence of the Earth, the Sun, and most of the macroscopic objects in the universe.' back

Gravitational singularity - Wikipedia, Gravitational singularity - Wikipedia, the free encyclopedia, 'A gravitational singularity (sometimes spacetime singularity) is, approximately, a place where quantities which are used to measure the gravitational field become infinite. Such quantities include the curvature of spacetime or the density of matter. More accurately, a spacetime with a singularity contains geodesics which cannot be completed in a smooth manner. The limit of such a geodesic is the singularity.' back

Gregory J. Chaitin, Gödel's Theorem and Information, 'Abstract: Gödel's theorem may be demonstrated using arguments having an information-theoretic flavor. In such an approach it is possible to argue that if a theorem contains more information than a given set of axioms, then it is impossible for the theorem to be derived from the axioms. In contrast with the traditional proof based on the paradox of the liar, this new viewpoint suggests that the incompleteness phenomenon discovered by Gödel is natural and widespread rather than pathological and unusual.' International Journal of Theoretical Physics 21 (1982), pp. 941-954 back

Half-life - Wikipedia, Half-life - Wikipedia, the free encyclopedia, 'Half-life (symbol t1⁄2) is the time required for a quantity to reduce to half its initial value. The term is commonly used in nuclear physics to describe how quickly unstable atoms undergo, or how long stable atoms survive, radioactive decay. The term is also used more generally to characterize any type of exponential or non-exponential decay. For example, the medical sciences refer to the biological half-life of drugs and other chemicals in the human body. The converse of half-life is doubling time.' back
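The definition quoted above translates directly into the decay law N(t) = N0 · (1/2)^(t/t½); a minimal Python sketch (using the familiar ~5730-year half-life of carbon-14 as an illustration):

```python
# Exponential decay counted in half-lives: N(t) = N0 * (1/2)**(t / t_half).
def remaining(n0, t, t_half):
    return n0 * 0.5 ** (t / t_half)

# Carbon-14, t_half ~ 5730 years:
print(remaining(1000.0, 5730.0, 5730.0))    # 500.0 left after one half-life
print(remaining(1000.0, 11460.0, 5730.0))   # 250.0 left after two half-lives
```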

Hamilton's principle - Wikipedia, Hamilton's principle - Wikipedia, the free encyclopedia, 'In physics, Hamilton's principle is William Rowan Hamilton's formulation of the principle of stationary action . . . It states that the dynamics of a physical system is determined by a variational problem for a functional based on a single function, the Lagrangian, which contains all physical information concerning the system and the forces acting on it.' back

Heat death of the universe - Wikipedia, Heat death of the universe - Wikipedia, the free encyclopedia, 'The heat death is a possible final state of the universe, in which it has "run down" to a state of no thermodynamic free energy to sustain motion or life. In physical terms, it has reached maximum entropy. The hypothesis of a universal heat death stems from the 1850s ideas of William Thomson (Lord Kelvin) who extrapolated the theory of heat views of mechanical energy loss in nature, as embodied in the first two laws of thermodynamics, to universal operation' back

Helium - Wikipedia, Helium - Wikipedia, the free encyclopedia, 'Helium is a chemical element with symbol He and atomic number 2. It is a colorless, odorless, tasteless, non-toxic, inert, monatomic gas, the first in the noble gas group in the periodic table. Its boiling point is the lowest among all the elements.' back

Cogito ergo sum - Wikipedia, Cogito ergo sum - Wikipedia, the free encyclopedia, 'Cogito ergo sum is a Latin philosophical proposition by René Descartes usually translated into English as "I think, therefore I am". The phrase originally appeared in French as je pense, donc je suis in his Discourse on the Method, so as to reach a wider audience than Latin would have allowed. It appeared in Latin in his later Principles of Philosophy. As Descartes explained, "[W]e cannot doubt of our existence while we doubt … ." ' back

Reflections on the Motive Power of Fire - Wikipedia, Reflections on the Motive Power of Fire - Wikipedia, the free encyclopedia, 'Reflections on the Motive Power of Fire and on Machines Fitted to Develop that Power is a book published in 1824 by French physicist Sadi Carnot. The 118-page book's French title was Réflexions sur la puissance motrice du feu et sur les machines propres à développer cette puissance. It is a significant publication in the history of thermodynamics about a generalized theory of heat engines.' back

Inertial frame of reference - Wikipedia, Inertial frame of reference - Wikipedia, the free encyclopedia, 'In physics, an inertial frame of reference (also inertial reference frame or inertial frame or Galilean reference frame or inertial space) is a frame of reference that describes time and space homogeneously, isotropically, and in a time-independent manner' back

Initial singularity - Wikipedia, Initial singularity - Wikipedia, the free encyclopedia, 'The initial singularity was the gravitational singularity of infinite density thought to have contained all of the mass and spacetime of the Universe before quantum fluctuations caused it to rapidly expand in the Big Bang and subsequent inflation, creating the present-day Universe.' back

Intelligent design - Wikipedia, Intelligent design - Wikipedia, the free encyclopedia, ' Intelligent design (ID) is a pseudoscientific argument for the existence of God, presented by its proponents as "an evidence-based scientific theory about life's origins". Proponents claim that "certain features of the universe and of living things are best explained by an intelligent cause, not an undirected process such as natural selection." ID is a form of creationism that lacks empirical support and offers no testable or tenable hypotheses, so it is not science.' back

Isaac Newton, Philosophia Naturalis Principia Mathematica, Latin Edition, London 1687: Praefatio: 'Cum Veteres Mechanicam (uti Author est Pappus) in rerum Naturalium investigatione maximi fecerint, & recentiores, missis formis substantialibus & qualitatibus occultis, Phænomena Naturæ ad leges Mathematicas revocare aggressi sint: Visum est in hoc Tractatu Mathesin excolere quatenus ea ad Philosophiam spectat. . . . ' back

Isaac Newton - Wikipedia, Isaac Newton - Wikipedia, the free encyclopedia, 'Sir Isaac Newton PRS (25 December 1642 – 20 March 1726/27) was an English physicist and mathematician (described in his own day as a "natural philosopher") who is widely recognised as one of the most influential scientists of all time and a key figure in the scientific revolution. His book Philosophiæ Naturalis Principia Mathematica ("Mathematical Principles of Natural Philosophy"), first published in 1687, laid the foundations for classical mechanics.' back

Jeffrey Nicholls, Prolegomenon to Scientific Theology, ' This thesis is an attempt to carry speculative theology beyond the apogee it reached in the medieval work of Thomas Aquinas into the world of empirical science (Aquinas 2019). Since the time of Aquinas, our understanding of the Universe has increased enormously. The ancient theologians not only conceived a perfect God, but they also saw the world as a very imperfect place. Their reaction was to place God outside the world. I will argue that we live in a Universe which approaches infinity in size and complexity, is as perfect as can be, and fulfils all the roles traditionally attributed to God, creator, lawmaker and judge.' back

John Paul II, Truth Cannot Contradict Truth, Address to the Pontifical Academy of Sciences October 22, 1996 , 'Consideration of the method used in the various branches of knowledge makes it possible to reconcile two points of view which would seem irreconcilable. The sciences of observation describe and measure the multiple manifestations of life with increasing precision and correlate them with the time line. The moment of transition to the spiritual cannot be the object of this kind of observation, which nevertheless can discover at the experimental level a series of very valuable signs indicating what is specific to the human being.' back

John Paul II (1983), Code of Canon Law, Canon 252: §3. ' There are to be classes in dogmatic theology, always grounded in the written word of God together with sacred tradition; through these, students are to learn to penetrate more intimately the mysteries of salvation, especially with St. Thomas as a teacher. There are also to be classes in moral and pastoral theology, canon law, liturgy, ecclesiastical history, and other auxiliary and special disciplines, according to the norm of the prescripts of the program of priestly formation.' back

John von Neumann, The Mathematical Foundations of Quantum Mechanics, translated from the German by Robert T. Beyer, New Edition edited by Nicholas A. Wheeler, Princeton University Press, Princeton & Oxford. Preface: 'This book is the realization of my long-held intention to someday use the resources of TEX to produce a more easily read version of Robert T. Beyer's authorized English translation (Princeton University Press, 1955) of John von Neumann's classic Mathematische Grundlagen der Quantenmechanik (Springer, 1932).' back

Jose Wudka, Precession of the perihelion of Mercury, 'A long-standing problem in the study of the Solar System was that the orbit of Mercury did not behave as required by Newton's equations. . . . There is a discrepancy of 43 seconds of arc per century. . . . Einstein was able to predict, without any adjustments whatsoever, that the orbit of Mercury should precess by an extra 43 seconds of arc per century should the General Theory of Relativity be correct.' back

Karim Shaheen, March for Turkey's jailed judges highlights purge on dissidents, ' Interviews with former members of the judiciary and their families, legal experts, defence counsels and senior lawmakers, reveal a broad and systematic attempt at intimidating and reshaping Turkey's judicial branch in an effort to further consolidate power in the hands of the ruling AKP party and Turkey's president, Recep Tayyip Erdoğan. . . . Tens of thousands of people have been arrested or dismissed from their jobs in the civil service, military, judiciary, academia and media, in a broad crackdown that the government says is aimed at followers of Fethullah Gülen, an exiled preacher whose movement is widely believed to have been behind the coup attempt last July. But that purge has gone beyond the alleged perpetrators to encompass dissidents of all stripes, including senior opposition lawmakers.' back

Kenneth G Wilson, The Renormalisation Group and Critical Phenomena, Nobel Prize Lecture, 8 December 1982: 'This paper has three parts. The first part is a simplified presentation of the basic ideas of the renormalization group and the ε expansion applied to critical phenomena, following roughly a summary exposition given in 1972. The second part is an account of the history (as I remember it) of work leading up to the papers in 1971-1972 on the renormalization group. Finally, some of the developments since 1971 will be summarized, and an assessment for the future given.' back

Kerson Huang, A Critical History of Renormalization, ' The history of renormalization is reviewed with a critical eye, starting with Lorentz's theory of radiation damping, through perturbative QED with Dyson, Gell-Mann & Low, and others, to Wilson's formulation and Polchinski's functional equation, and applications to "triviality", and dark energy in cosmology.' back

Kirchhoff's law of thermal radiation - Wikipedia, Kirchhoff's law of thermal radiation - Wikipedia, the free encyclopedia, 'Kirchhoff's law states that: For a body of any arbitrary material, emitting and absorbing thermal electromagnetic radiation at every wavelength in thermodynamic equilibrium, the ratio of its emissive power to its dimensionless coefficient of absorption is equal to a universal function only of radiative wavelength and temperature, the perfect black-body emissive power.' back

Lagrangian - Wikipedia, Lagrangian - Wikipedia, the free encyclopedia, 'The Lagrangian, L, of a dynamical system is a function that summarizes the dynamics of the system. It is named after Joseph Louis Lagrange. The concept of a Lagrangian was originally introduced in a reformulation of classical mechanics by Irish mathematician William Rowan Hamilton known as Lagrangian mechanics. In classical mechanics, the Lagrangian is defined as the kinetic energy, T, of the system minus its potential energy, V. In symbols, L = T - V. ' back
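The definition L = T − V quoted above can be evaluated for the simplest case; a minimal Python sketch (a falling mass with illustrative values, and g rounded to 10 m/s² for clarity):

```python
# Lagrangian L = T - V for a mass m at one instant of free fall under gravity.
m, g = 2.0, 10.0         # mass (kg) and gravitational acceleration (m/s^2, rounded)
v, h = 3.0, 10.0         # speed (m/s) and height (m) at this instant

T = 0.5 * m * v**2       # kinetic energy, J
V = m * g * h            # potential energy, J
L = T - V
print(L)                 # 9.0 - 200.0 = -191.0 J
```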

Lie Group - Wikipedia, Lie Group - Wikipedia, the free encyclopedia, 'In mathematics, a Lie group . . . is a group that is also a differentiable manifold, with the property that the group operations are compatible with the smooth structure. Lie groups are named after Norwegian mathematician Sophus Lie, who laid the foundations of the theory of continuous transformation groups. Lie groups represent the best-developed theory of continuous symmetry of mathematical objects and structures, which makes them indispensable tools for many parts of contemporary mathematics, as well as for modern theoretical physics. . . . One of the key ideas in the theory of Lie groups is to replace the global object, the group, with its local or linearized version, which Lie himself called its "infinitesimal group" and which has since become known as its Lie algebra.' back

Logos (Christianity) - Wikipedia, Logos (Christianity) - Wikipedia, the free encyclopedia, 'In Christology, the conception that the Christ is the Logos (Λόγος, the Greek for "word", "discourse" or "reason") has been important in establishing the doctrine of the divinity of Jesus Christ and his position as God the Son in the Trinity as set forth in the Chalcedonian Creed. . . . The conception derives from the opening of the Gospel of John, commonly translated into English as: "In the beginning was the Word, and the Word was with God, and the Word was God." In the original Greek, Logos (λόγος) is used for "Word," and in theological discourse, this is often left untranslated.' back

Lorentz transformation - Wikipedia, Lorentz transformation - Wikipedia, the free encyclopedia, 'In physics, the Lorentz transformation or Lorentz-Fitzgerald transformation describes how, according to the theory of special relativity, two observers' varying measurements of space and time can be converted into each other's frames of reference. It is named after the Dutch physicist Hendrik Lorentz. It reflects the surprising fact that observers moving at different velocities may measure different distances, elapsed times, and even different orderings of events.' back
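The "different elapsed times" mentioned above are governed by the Lorentz factor γ = 1/√(1 − v²/c²); a minimal Python sketch (the velocity 0.6c is an arbitrary illustrative choice, not from the source):

```python
import math

# Lorentz factor gamma = 1 / sqrt(1 - (v/c)^2); at v = 0.6c it equals 1.25.
beta = 0.6                               # v/c, dimensionless
gamma = 1.0 / math.sqrt(1.0 - beta**2)
print(gamma)                             # ~1.25

# Time dilation: one second of proper time in the moving frame
# corresponds to gamma seconds for the "stationary" observer.
proper_time = 1.0
print(gamma * proper_time)               # ~1.25 s
```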

Ludwig Boltzmann - Wikipedia, Ludwig Boltzmann - Wikipedia, the free encyclopedia, 'Ludwig Eduard Boltzmann (February 20, 1844 – September 5, 1906) was an Austrian physicist and philosopher whose greatest achievement was in the development of statistical mechanics, which explains and predicts how the properties of atoms (such as mass, charge, and structure) determine the physical properties of matter (such as viscosity, thermal conductivity, and diffusion).' back

Magisterium - Wikipedia, Magisterium - Wikipedia, the free encyclopedia, 'The magisterium of the Catholic Church is the church's authority or office to establish its own authentic teachings. That authority is vested uniquely by the pope and by the bishops, under the premise that they are in communion with the correct and true teachings of the faith. Sacred scripture and sacred tradition "make up a single sacred deposit of the Word of God, which is entrusted to the Church", and the magisterium is not independent of this, since "all that it proposes for belief as being divinely revealed is derived from this single deposit of faith." ' back

Many-worlds interpretation - Wikipedia, Many-worlds interpretation - Wikipedia, the free encyclopedia, ' The many-worlds interpretation (MWI) is an interpretation of quantum mechanics that asserts that the universal wavefunction is objectively real, and that there is no wavefunction collapse. This implies that all possible outcomes of quantum measurements are physically realized in some "world" or universe.' back

Martin Luther - Wikipedia, Martin Luther - Wikipedia, the free encyclopedia, 'Martin Luther (10 November 1483 – 18 February 1546) was a German priest, professor of theology and iconic figure of the Protestant Reformation.He strongly disputed the claim that freedom from God's punishment for sin could be purchased with money. He confronted indulgence salesman Johann Tetzel with his Ninety-Five Theses in 1517. His refusal to retract all of his writings at the demand of Pope Leo X in 1520 and the Holy Roman Emperor Charles V at the Diet of Worms in 1521 resulted in his excommunication by the pope and condemnation as an outlaw by the Emperor.' back

Mathematical proof - Wikipedia, Mathematical proof - Wikipedia, the free encyclopedia, 'In mathematics, a proof is an inferential argument for a mathematical statement. In the argument, other previously established statements, such as theorems, can be used. In principle, a proof can be traced back to self-evident or assumed statements, known as axioms, along with accepted rules of inference.' back

Matrix mechanics - Wikipedia, Matrix mechanics - Wikipedia, the free encyclopedia, 'Matrix mechanics is a formulation of quantum mechanics created by Werner Heisenberg, Max Born, and Pascual Jordan in 1925. Matrix mechanics was the first conceptually autonomous and logically consistent formulation of quantum mechanics. It extended the Bohr Model by describing how the quantum jumps occur. It did so by interpreting the physical properties of particles as matrices that evolve in time. It is equivalent to the Schrödinger wave formulation of quantum mechanics, and is the basis of Dirac's bra-ket notation for the wave function.' back

Maupertuis' principle - Wikipedia, Maupertuis' principle - Wikipedia, the free encyclopedia, 'In classical mechanics, Maupertuis' principle (named after Pierre Louis Maupertuis) is an integral equation that determines the path followed by a physical system without specifying the time parameterization of that path. It is a special case of the more generally stated principle of least action. More precisely, it is a formulation of the equations of motion for a physical system not as differential equations, but as an integral equation, using the calculus of variations.' back

Measurement in quantum mechanics - Wikipedia, Measurement in quantum mechanics - Wikipedia, the free encyclopedia, 'The framework of quantum mechanics requires a careful definition of measurement. The issue of measurement lies at the heart of the problem of the interpretation of quantum mechanics, for which there is currently no consensus.' back

Measurement problem - Wikipedia, Measurement problem - Wikipedia, the free encyclopedia, 'The measurement problem in quantum mechanics is the problem of how (or whether) wave function collapse occurs. The inability to observe this process directly has given rise to different interpretations of quantum mechanics, and poses a key set of questions that each interpretation must answer. The wave function in quantum mechanics evolves deterministically according to the Schrödinger equation as a linear superposition of different states, but actual measurements always find the physical system in a definite state. Any future evolution is based on the state the system was discovered to be in when the measurement was made, meaning that the measurement "did something" to the system that is not obviously a consequence of Schrödinger evolution.' back

Michael Tooley (Stanford Encyclopedia of Philosophy), The Problem of Evil, 'The epistemic question posed by evil is whether the world contains undesirable states of affairs that provide the basis for an argument that makes it unreasonable to believe in the existence of God.' back

Minkowski space - Wikipedia, Minkowski space - Wikipedia, the free encyclopedia, ' In mathematical physics, Minkowski space or Minkowski spacetime is a combination of Euclidean space and time into a four-dimensional manifold where the spacetime interval between any two events is independent of the inertial frame of reference in which they are recorded. Although initially developed by mathematician Hermann Minkowski for Maxwell's equations of electromagnetism, the mathematical structure of Minkowski spacetime was shown to be an immediate consequence of the postulates of special relativity.' back

Nucleic acid - Wikipedia, Nucleic acid - Wikipedia, the free encyclopedia, ' Nucleic acids are the biopolymers, or small biomolecules, essential to all known forms of life. The term nucleic acid is the overall name for DNA and RNA. They are composed of nucleotides, which are the monomers made of three components: a 5-carbon sugar, a phosphate group and a nitrogenous base. If the sugar is a compound ribose, the polymer is RNA (ribonucleic acid); if the sugar is derived from ribose as deoxyribose, the polymer is DNA (deoxyribonucleic acid).' back

Nucleosynthesis - Wikipedia, Nucleosynthesis - Wikipedia, the free encyclopedia, 'Nucleosynthesis is the process that creates new atomic nuclei from pre-existing nucleons, primarily protons and neutrons. The first nuclei were formed about three minutes after the Big Bang, through the process called Big Bang nucleosynthesis. It was then that hydrogen and helium formed to become the content of the first stars, and is responsible for the present hydrogen/helium ratio of the cosmos.' back

Observable Universe - Wikipedia, Observable Universe - Wikipedia, the free encyclopedia, 'The observable universe is a spherical region of the universe comprising all matter that can be observed from Earth or its space-based telescopes and exploratory probes at the present time, because electromagnetic radiation from these objects has had time to reach the Solar System and Earth since the beginning of the cosmological expansion. There are at least 2 trillion galaxies in the observable universe.' back

P versus NP problem - Wikipedia, P versus NP problem - Wikipedia, the free encyclopedia, 'The P versus NP problem is a major unsolved problem in computer science. It asks whether every problem whose solution can be quickly verified (technically, verified in polynomial time) can also be solved quickly (again, in polynomial time). The underlying issues were first discussed in the 1950s, in letters from John Forbes Nash Jr. to the National Security Agency, and from Kurt Gödel to John von Neumann. The precise statement of the P versus NP problem was introduced in 1971 by Stephen Cook in his seminal paper "The complexity of theorem proving procedures" and is considered by many to be the most important open problem in the field.' back

P. Lumbreras (Part I), The Twenty-Four Fundamental Theses of Official Catholic Philosophy, ' Nobody can deny that the Church has full authority to regulate the teaching of philosophy in Catholic educational institutions. Pope Leo XIII said: "The only-begotten Son of the Eternal Father, who came on earth to bring salvation and the light of divine wisdom to men, conferred a great and wonderful blessing on the world when, about to ascend again into heaven, He commanded the Apostles to go and teach all nations, and left the Church which He had founded to be the common and supreme teacher of the peoples."' back

P. Lumbreras (Part II), The Twenty-Four Fundamental Theses of Official Catholic Philosophy, ' In our preceding paper we proved by documents of recent Popes that the Church, in exercising her right, has adopted the scholastic philosophy as her official philosophical teaching, that by scholastic philosophy the Church understands not only chiefly but exclusively the philosophy of St. Thomas, and that St. Thomas' philosophy stands for at least the twenty-four theses approved and published by the Sacred Congregation of Studies. In this paper we will give a translation of these theses with a very brief explanation of each.' back

Path integral formulation - Wikipedia, Path integral formulation - Wikipedia, the free encyclopedia, 'The path integral formulation of quantum mechanics is a description of quantum theory which generalizes the action principle of classical mechanics. It replaces the classical notion of a single, unique trajectory for a system with a sum, or functional integral, over an infinity of possible trajectories to compute a quantum amplitude. . . . This formulation has proved crucial to the subsequent development of theoretical physics, since it provided the basis for the grand synthesis of the 1970s which unified quantum field theory with statistical mechanics. . . . ' back

Paul Helm, Eternity (Stanford Encyclopedia of Philosophy), 'Concepts of eternity have developed in a way that is, as a matter of fact, closely connected to the development of the concept of God in Western thought, beginning with ancient Greek philosophers; particularly to the idea of God's relation to time, the idea of divine perfection, and the Creator-creature distinction.' back

Pauli exclusion principle - Wikipedia, Pauli exclusion principle - Wikipedia, the free encyclopedia, 'The Pauli exclusion principle is the quantum mechanical principle that no two identical fermions (particles with half-integer spin) may occupy the same quantum state simultaneously. A more rigorous statement is that the total wave function for two identical fermions is anti-symmetric with respect to exchange of the particles. The principle was formulated by Austrian physicist Wolfgang Pauli in 1925.' back

Phase (matter) - Wikipedia, Phase (matter) - Wikipedia, the free encyclopedia, 'In the physical sciences, a phase is a region of space (a thermodynamic system), throughout which all physical properties of a material are essentially uniform. Examples of physical properties include density, index of refraction, magnetization and chemical composition. A simple description is that a phase is a region of material that is chemically uniform, physically distinct, and (often) mechanically separable. In a system consisting of ice and water in a glass jar, the ice cubes are one phase, the water is a second phase, and the humid air over the water is a third phase. The glass of the jar is another separate phase.' back

Pius X: On the doctrines of the modernists, Pascendi dominici gregis, '2. That We make no delay in this matter is rendered necessary especially by the fact that the partisans of error are to be sought not only among the Church's open enemies; they lie hid, a thing to be deeply deplored and feared, in her very bosom and heart, and are the more mischievous, the less conspicuously they appear. We allude, Venerable Brethren, to many who belong to the Catholic laity, nay, and this is far more lamentable, to the ranks of the priesthood itself, who, feigning a love for the Church, lacking the firm protection of philosophy and theology, nay more, thoroughly imbued with the poisonous doctrines taught by the enemies of the Church, and lost to all sense of modesty, vaunt themselves as reformers of the Church; and, forming more boldly into line of attack, assail all that is most sacred in the work of Christ, not sparing even the person of the Divine Redeemer, whom, with sacrilegious daring, they reduce to a simple, mere man.' back

Planck constant - Wikipedia, Planck constant - Wikipedia, the free encyclopedia, ' Since energy and mass are equivalent, the Planck constant also relates mass to frequency. By 2017, the Planck constant had been measured with sufficient accuracy in terms of the SI base units, that it was central to replacing the metal cylinder, called the International Prototype of the Kilogram (IPK), that had defined the kilogram since 1889. . . . For this new definition of the kilogram, the Planck constant, as defined by the ISO standard, was set to 6.626 070 15 × 10−34 J⋅s exactly. ' back

Platonic epistemology - Wikipedia, Platonic epistemology - Wikipedia, the free encyclopedia, 'Platonic epistemology holds that knowledge of Platonic Ideas is innate, so that learning is the development of ideas buried deep in the soul, often under the midwife-like guidance of an interrogator. In several dialogues by Plato, the character Socrates presents the view that each soul existed before birth with the Form of the Good and a perfect knowledge of Ideas. Thus, when an Idea is "learned" it is actually just "recalled".' back

Pope John Paul II, The Catechism of the Catholic Church, The text of the Apostolic Constitution Fidei Depositum Prologue: '... 11 This catechism aims at presenting an organic synthesis of the essential and fundamental contents of Catholic doctrine, as regards both faith and morals, in the light of the Second Vatican Council and the whole of the Church's Tradition. Its principal sources are the Sacred Scriptures, the Fathers of the Church, the liturgy, and the Church's Magisterium. It is intended to serve "as a point of reference for the catechisms or compendia that are composed in the various countries. ...' back

Power (philosophy) - Wikipedia, Power (philosophy) - Wikipedia, the free encyclopedia, 'Power is a measurement of an entity's ability to control its environment, including the behavior of other entities. The term authority is often used for power perceived as legitimate by the social structure. Power can be seen as evil or unjust, but the exercise of power is accepted as endemic to humans as social beings. In the corporate environment, power is often expressed as upward or downward. With downward power, a company's superior influences subordinates. When a company exerts upward power, it is the subordinates who influence the decisions of the leader (Greiner & Schein, 1988). Often, the study of power in a society is referred to as politics.' back

Precision tests of QED - Wikipedia, Precision tests of QED - Wikipedia, the free encyclopedia, 'Quantum electrodynamics (QED), a relativistic quantum field theory of electrodynamics, is among the most stringently tested theories in physics. . . . Tests of a theory are normally carried out by comparing experimental results to theoretical predictions. . . . The agreement found this way is to within ten parts in a billion . . . . This makes QED one of the most accurate physical theories constructed thus far.' back

Propaganda - Wikipedia, Propaganda - Wikipedia, the free encyclopedia, ' Propaganda is communication that is used primarily to influence an audience and further an agenda, which may not be objective and may be presenting facts selectively to encourage a particular synthesis or perception, or using loaded language to produce an emotional rather than a rational response to the information that is presented. Propaganda is often associated with material prepared by governments, but activist groups, companies, religious organizations, the media, and individuals can also produce propaganda. In the 20th century, the term propaganda has often been associated with a manipulative approach, but propaganda historically is a neutral descriptive term.' back

Pure mathematics - Wikipedia, Pure mathematics - Wikipedia, the free encyclopedia, 'Broadly speaking, pure mathematics is mathematics that studies entirely abstract concepts. This was a recognizable category of mathematical activity from the 19th century onwards, at variance with the trend towards meeting the needs of navigation, astronomy, physics, economics, engineering, and so on.' back

Pythagorean theorem - Wikipedia, Pythagorean theorem - Wikipedia, the free encyclopedia, 'In any right triangle, the area of the square whose side is the hypotenuse (the side opposite the right angle) is equal to the sum of the areas of the squares whose sides are the two legs (the two sides that meet at a right angle). This is usually summarized as: The square on the hypotenuse is equal to the sum of the squares on the other two sides.' back

Quantum chromodynamics - Wikipedia, Quantum chromodynamics - Wikipedia, the free encyclopedia, ' In theoretical physics, quantum chromodynamics (QCD) is the theory of the strong interaction between quarks and gluons, the fundamental particles that make up composite hadrons such as the proton, neutron and pion. QCD is a type of quantum field theory called a non-abelian gauge theory, with symmetry group SU(3). The QCD analog of electric charge is a property called color. Gluons are the force carrier of the theory, like photons are for the electromagnetic force in quantum electrodynamics.' back

Quantum computing - Wikipedia, Quantum computing - Wikipedia, the free encyclopedia, 'Quantum computing studies theoretical computation systems (quantum computers) that make direct use of quantum-mechanical phenomena, such as superposition and entanglement, to perform operations on data.' back

Quantum field theory - Wikipedia, Quantum field theory - Wikipedia, the free encyclopedia, 'Quantum field theory (QFT) provides a theoretical framework for constructing quantum mechanical models of systems classically described by fields or (especially in a condensed matter context) of many-body systems. . . . In QFT photons are not thought of as 'little billiard balls', they are considered to be field quanta - necessarily chunked ripples in a field that 'look like' particles. Fermions, like the electron, can also be described as ripples in a field, where each kind of fermion has its own field. In summary, the classical visualisation of "everything is particles and fields", in quantum field theory, resolves into "everything is particles", which then resolves into "everything is fields". In the end, particles are regarded as excited states of a field (field quanta).' back

Quantum superposition - Wikipedia, Quantum superposition - Wikipedia, the free encyclopedia, 'Quantum superposition is the application of the superposition principle to quantum mechanics. The superposition principle is the addition of the amplitudes of waves from interference. In quantum mechanics it is the sum of wavefunction amplitudes, or state vectors. It occurs when an object simultaneously "possesses" two or more possible values for an observable quantity (e.g. the position or energy of a particle)' back

Rafael Sanjuán et al, Viral Mutation Rates, ' Accurate estimates of virus mutation rates are important to understand the evolution of the viruses and to combat them. However, methods of estimation are varied and often complex. Here, we critically review over 40 original studies and establish criteria to facilitate comparative analyses. The mutation rates of 23 viruses are presented as substitutions per nucleotide per cell infection (s/n/c) and corrected for selection bias where necessary, using a new statistical method. The resulting rates range from 10−8 to 10−6 s/n/c for DNA viruses and from 10−6 to 10−4 s/n/c for RNA viruses.' back

Redshift - Wikipedia, Redshift - Wikipedia, the free encyclopedia, 'A redshift occurs whenever a light source moves away from an observer. A special instance of this is the cosmological redshift, which is due to the expansion of the universe, and sufficiently distant light sources (generally more than a few million light years away) show redshift corresponding to the rate of increase in their distance from Earth.' back

Renormalization - Wikipedia, Renormalization - Wikipedia, the free encyclopedia, ' Renormalization is a collection of techniques in quantum field theory, the statistical mechanics of fields, and the theory of self-similar geometric structures, that are used to treat infinities arising in calculated quantities by altering values of quantities to compensate for effects of their self-interactions. But even if it were the case that no infinities arose in loop diagrams in quantum field theory, it could be shown that renormalization of mass and fields appearing in the original Lagrangian is necessary.' back

Richard P. Feynman, Nobel Lecture: The Development of the Space-Time View of Quantum Electrodynamics, Nobel Lecture, December 11, 1965: 'We have a habit in writing articles published in scientific journals to make the work as finished as possible, to cover all the tracks, to not worry about the blind alleys or to describe how you had the wrong idea first, and so on. So there isn't any place to publish, in a dignified manner, what you actually did in order to get to do the work, although, there has been in these days, some interest in this kind of thing. Since winning the prize is a personal thing, I thought I could be excused in this particular situation, if I were to talk personally about my relationship to quantum electrodynamics, rather than to discuss the subject itself in a refined and finished fashion. Furthermore, since there are three people who have won the prize in physics, if they are all going to be talking about quantum electrodynamics itself, one might become bored with the subject. So, what I would like to tell you about today are the sequence of events, really the sequence of ideas, which occurred, and by which I finally came out the other end with an unsolved problem for which I ultimately received a prize.' back

Rolf Landauer, Information is a Physical Entity, 'Abstract: This paper, associated with a broader conference talk on the fundamental physical limits of information handling, emphasizes the aspects still least appreciated. Information is not an abstract entity but exists only through a physical representation, thus tying it to all the restrictions and possibilities of our real physical universe. The mathematician's vision of an unlimited sequence of totally reliable operations is unlikely to be implementable in this real universe. Speculative remarks about the possible impact of that on the ultimate nature of the laws of physics are included.' back

Schrödinger equation - Wikipedia, Schrödinger equation - Wikipedia, the free encyclopedia, 'In quantum mechanics, the Schrödinger equation is a partial differential equation that describes how the quantum state of a quantum system changes with time. It was formulated in late 1925, and published in 1926, by the Austrian physicist Erwin Schrödinger. . . . In classical mechanics Newton's second law, (F = ma), is used to mathematically predict what a given system will do at any time after a known initial condition. In quantum mechanics, the analogue of Newton's law is Schrödinger's equation for a quantum system (usually atoms, molecules, and subatomic particles whether free, bound, or localized). It is not a simple algebraic equation, but in general a linear partial differential equation, describing the time-evolution of the system's wave function (also called a "state function").' back

Shield of the Trinity - Wikipedia, Shield of the Trinity - Wikipedia, the free encyclopedia, 'The Shield of the Trinity or Scutum Fidei is a traditional Christian visual symbol which expresses many aspects of the doctrine of the Trinity, summarizing the first part of the Athanasian Creed in a compact diagram. In late medieval England and France, this emblem was considered to be the heraldic arms of God (and of the Trinity).' back

Silvan S. Schweber, The sources of Schwinger's Green's functions, ' QED, Fermi's theory of β-decay, and Yukawa's theory of nuclear forces established the model on which subsequent developments were based. It postulated “impermanent” particles to account for interactions and assumed that relativistic QFT was the proper framework for representing processes at ever smaller distances. But relativistic QFTs are beset by divergence difficulties manifested in perturbative calculations beyond the lowest order. Higher orders yield infinite results. These difficulties stem from the fact that (i) the description is in term of local fields (i.e., fields that are defined at a point in space-time), which are systems with an infinite number of degrees of freedom, and (ii) the interaction between fields is assumed to be local.' back

Spin-statistics theorem - Wikipedia, Spin-statistics theorem - Wikipedia, the free encyclopedia, 'In quantum mechanics, the spin–statistics theorem relates the spin of a particle to the particle statistics it obeys. The spin of a particle is its intrinsic angular momentum (that is, the contribution to the total angular momentum that is not due to the orbital motion of the particle). All particles have either integer spin or half-integer spin (in units of the reduced Planck constant ħ). The theorem states that: The wave function of a system of identical integer-spin particles has the same value when the positions of any two particles are swapped. Particles with wave functions symmetric under exchange are called bosons. The wave function of a system of identical half-integer spin particles changes sign when two particles are swapped. Particles with wave functions antisymmetric under exchange are called fermions.' back

Standard model - Wikipedia, Standard model - Wikipedia, the free encyclopedia, 'The Standard Model of particle physics is a theory that describes three of the four known fundamental interactions between the elementary particles that make up all matter. It is a quantum field theory developed between 1970 and 1973 which is consistent with both quantum mechanics and special relativity. To date, almost all experimental tests of the three forces described by the Standard Model have agreed with its predictions. However, the Standard Model falls short of being a complete theory of fundamental interactions, primarily because of its lack of inclusion of gravity, the fourth known fundamental interaction, but also because of the large number of numerical parameters (such as masses and coupling constants) that must be put "by hand" into the theory (rather than being derived from first principles) . . . ' back

Statistical mechanics - Wikipedia, Statistical mechanics - Wikipedia, the free encyclopedia, 'Statistical mechanics (or statistical thermodynamics) is the application of probability theory, which includes mathematical tools for dealing with large populations, to the field of mechanics, which is concerned with the motion of particles or objects when subjected to a force. . . . The essential problem in statistical thermodynamics is to determine the distribution of a given amount of energy E over N identical systems. The goal of statistical thermodynamics is to understand and to interpret the measurable macroscopic properties of materials in terms of the properties of their constituent particles and the interactions between them. This is done by connecting thermodynamic functions to quantum-mechanical equations. Two central quantities in statistical thermodynamics are the Boltzmann factor and the partition function.' back

Stellar evolution - Wikipedia, Stellar evolution - Wikipedia, the free encyclopedia, 'Stellar evolution is the process by which a star changes during its lifetime. Depending on the mass of the star, this lifetime ranges from a few million years for the most massive to trillions of years for the least massive, which is considerably longer than the age of the universe.' back

Steven Weinberg, Einstein's Mistakes, ' The difficulty is not that quantum mechanics is probabilistic—that is something we apparently just have to live with. The real difficulty is that it is also deterministic, or more precisely, that it combines a probabilistic interpretation with deterministic dynamics.' back

Stochastic - Wikipedia, Stochastic - Wikipedia, the free encyclopedia, 'The word stochastic is an adjective in English that describes something that was randomly determined. The word first appeared in English to describe a mathematical object called a stochastic process, but now in mathematics the terms stochastic process and random process are considered interchangeable. The word, with its current definition meaning random, came from German, but it originally came from the Greek word στόχος (stokhos, "aim").' back

Strong interaction - Wikipedia, Strong interaction - Wikipedia, the free encyclopedia, 'The strong nuclear force holds most ordinary matter together because it confines quarks into hadron particles such as the proton and neutron. In addition, the strong force binds neutrons and protons to create atomic nuclei. Most of the mass of a common proton or neutron is the result of the strong force field energy; the individual quarks provide only about 1% of the mass of a proton.' back

Tensor product - Wikipedia, Tensor product - Wikipedia, the free encyclopedia, ' In mathematics, the tensor product V ⊗ W of two vector spaces V and W (over the same field) is itself a vector space, endowed with the operation of bilinear composition, denoted by ⊗, from ordered pairs in the Cartesian product V × W to V ⊗ W in a way that generalizes the outer product. Essentially the difference between a tensor product of two vectors and an ordered pair of vectors is that if one vector is multiplied by a nonzero scalar and the other is multiplied by the reciprocal of that scalar, the result is a different ordered pair of vectors, but the same tensor product of two vectors. The tensor product of V and W is the vector space generated by the symbols v ⊗ w, with v ∈ V and w ∈ W, in which the relations of bilinearity are imposed for the product operation ⊗, and no other relations are assumed to hold. The tensor product space is thus the "freest" (or most general) such vector space, in the sense of having the fewest constraints. ' back

The Illustris Collaboration, The Illustris Simulation, 'The Illustris project is a large cosmological simulation of galaxy formation, completed in late 2013, using a state of the art numerical code and a comprehensive physical model. Building on several years of effort by members of the collaboration, the Illustris simulation represents an unprecedented combination of high resolution, total volume, and physical fidelity.' back

Thomas Aquinas, Summa, I, 2, 3, Does God exist?, 'I answer that, The existence of God can be proved in five ways. The first and more manifest way is the argument from motion. . . . ' back

Thomas Robert Malthus - Wikipedia, Thomas Robert Malthus - Wikipedia, the free encyclopedia, 'The Reverend Thomas Robert Malthus FRS (February 1766 – December 1834) was an English scholar, influential in political economy and demography. Malthus popularized the economic theory of rent. Malthus has become widely known for his theories about population and its increase or decrease in response to various factors. The six editions of his An Essay on the Principle of Population, published from 1798 to 1826, observed that sooner or later population gets checked by famine and disease. ' back

Time-division multiplexing - Wikipedia, Time-division multiplexing - Wikipedia, the free encyclopedia, 'Time-division multiplexing (TDM) is a method of transmitting and receiving independent signals over a common signal path by means of synchronized switches at each end of the transmission line so that each signal appears on the line only a fraction of time in an alternating pattern.' back

Trinity - Wikipedia, Trinity - Wikipedia, the free encyclopedia, 'The Christian doctrine of the Trinity (from Latin trinitas "triad", from trinus "threefold") defines God as three consubstantial persons, expressions, or hypostases: the Father, the Son (Jesus Christ), and the Holy Spirit; "one God in three persons". The three persons are distinct, yet are one "substance, essence or nature" (homoousios). In this context, a "nature" is what one is, while a "person" is who one is.' back

Triple-alpha process - Wikipedia, Triple-alpha process - Wikipedia, the free encyclopedia, 'The triple-alpha process is a set of nuclear fusion reactions by which three helium-4 nuclei (alpha particles) are transformed into carbon.' back

Uncertainty principle - Wikipedia, Uncertainty principle - Wikipedia, the free encyclopedia, 'In quantum mechanics, the uncertainty principle, also known as Heisenberg's uncertainty principle, is any of a variety of mathematical inequalities asserting a fundamental limit to the precision with which certain pairs of physical properties of a particle, known as complementary variables, such as position x and momentum p, can be known.' back

Vacuum - Wikipedia, Vacuum - Wikipedia, the free encyclopedia, 'According to modern understanding, even if all matter could be removed from a volume, it would still not be "empty" due to vacuum fluctuations, dark energy, transiting gamma rays, cosmic rays, neutrinos, and other phenomena in quantum physics. In the electromagnetism in the 19th century, vacuum was thought to be filled with a medium called aether. In modern particle physics, the vacuum state is considered the ground state of matter.' back

Wave function collapse - Wikipedia, Wave function collapse - Wikipedia, the free encyclopedia, 'In quantum mechanics, wave function collapse is said to occur when a wave function—initially in a superposition of several eigenstates—appears to reduce to a single eigenstate (by "observation"). It is the essence of measurement in quantum mechanics and connects the wave function with classical observables like position and momentum. Collapse is one of two processes by which quantum systems evolve in time; the other is continuous evolution via the Schrödinger equation.' back

Weak interaction - Wikipedia, Weak interaction - Wikipedia, the free encyclopedia, 'In particle physics, the weak interaction (the weak force or weak nuclear force) is one of the four known fundamental interactions of nature, alongside the strong interaction, electromagnetism, and gravitation. The weak interaction is responsible for radioactive decay, which plays an essential role in nuclear fission.' back

Werner Heisenberg, Quantum-theoretical re-interpretation of kinematic and mechanical relations, 'The present paper seeks to establish a basis for theoretical quantum mechanics founded exclusively upon relationships between quantities which in principle are observable.' back

Wojciech Hubert Zurek, Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical, '(Submitted on 17 Mar 2007 (v1), last revised 18 Mar 2008 (this version, v3)) Measurements transfer information about a system to the apparatus, and then further on – to observers and (often inadvertently) to the environment. I show that even imperfect copying essential in such situations restricts possible unperturbed outcomes to an orthogonal subset of all possible states of the system, thus breaking the unitary symmetry of its Hilbert space implied by the quantum superposition principle. Preferred outcome states emerge as a result. They provide framework for the "wavepacket collapse", designating terminal points of quantum jumps, and defining the measured observable by specifying its eigenstates.' back

Zero-energy universe - Wikipedia, Zero-energy universe - Wikipedia, the free encyclopedia, 'The zero-energy universe hypothesis proposes that the total amount of energy in the universe is exactly zero: its amount of positive energy in the form of matter is exactly canceled out by its negative energy in the form of gravity. . . . The zero-energy universe theory originated in 1973, when Edward Tryon proposed in the journal Nature that the universe emerged from a large-scale quantum fluctuation of vacuum energy, resulting in its positive mass-energy being exactly balanced by its negative gravitational potential energy.' back

Zeroth law of thermodynamics - Wikipedia, Zeroth law of thermodynamics - Wikipedia, the free encyclopedia, ' The zeroth law of thermodynamics states that if two thermodynamic systems are each in thermal equilibrium with a third one, then they are in thermal equilibrium with each other. Accordingly, thermal equilibrium between systems is a transitive relation. . . . The physical meaning is expressed by Maxwell in the words: "All heat is of the same kind". Another statement of the law is "All diathermal walls are equivalent".' back

www.scientific_theology.com is maintained by The Theology Company Proprietary Limited ACN 097 887 075 ABN 74 097 887 075 Copyright 2000-2021 © Jeffrey Nicholls