scientific theology

This book is part of a project to develop a new scientific and democratic foundation for a catholic theology to replace current theological hypotheses

Scientific theology: a new theological hypothesis

Chapter 5: A network model

5.1: Human networking: face to face
5.2: Human networking: telecommunication
5.3: The network "atom"
5.4: Noise: the mathematical theory of communication
5.5: From the Turing machine to electronic computers
5.6: Two worlds: Classical and quantum
5.7: Finite classical computer networks
5.8: A transfinite computer network I: formalism
5.9: A transfinite computer network II: dynamics
5.10: Quantum mechanics describes a computation network
5.11: Complementarity: Measurement and meaning
5.12: Visibility and invisibility
5.13: Network intelligence: collective mind

5.1: Human networking: face to face

Everywhere we look there are networks. We are mostly very social beings. During the course of any day we are likely to spend hours meeting and talking to many different people about all sorts of things. Some of these people may be family, some intimate companions, some workmates, some friends, some even enemies. Our interactions are not always friendly or peaceful.

We evolved in Africa about 300 000 years ago. We have been a very successful species. As our population has increased we have occupied increasing areas of land to harvest the resources necessary to maintain our growing numbers. About 200 000 years ago some Homo sapiens left Africa and have since spread to almost every part of the planet. For most of this history all our dealings were face to face, in the family, village or battlefield. If we wanted to communicate with distant people, it was necessary to travel to meet them. Only in the last few thousand years have we been able to communicate at a distance, first through writing carried by post, and in the last few centuries by electromagnetic technologies, wired and wireless. Klein: The Human Career, Carl Zimmer: A single migration from Africa populated the world, Beth Blaxland & Fran Dorey: The first migrations out of Africa

Our interactions occur through many channels, all of which are physical and so may be called body language. Here we are communicating through written English, a representation of the spoken language communicated through fingers. Under different circumstances we might speak to one another, see, hear or touch one another, exchange goods and money, dance, kiss or mate. The hardware for all these channels of communication lies within our bodies, which are built with the physical layer of the Universe. Each of us is a huge network of electromagnetic nanomachines, molecules described by quantum electrodynamics (QED). QED describes all the physical and chemical interactions that make our lives possible. Richard Feynman: QED, The Strange Story of Light and Matter

We can model our social lives with communication networks. Since we are born into networks and built from them we understand them intuitively. We can develop a more formal picture of what is happening by considering engineered networks like postal systems, telephones and the internet. Finally, we can get a mathematical grip on the nature of network communication using Claude Shannon's mathematical theory of communication and Alan Turing's theory of computation. Claude E Shannon: A Mathematical Theory of Communication, Alan Turing: On Computable Numbers, with an application to the Entscheidungsproblem

This theory provides us with an abstract network model of the classical Universe. Then, in subsequent chapters, we will work through the quantum, cosmological, biological, political, and theological contributions of networked communications to everyday life. This will add flesh to the idea that the Universe is divine. The idea behind all this is that all our experiences of the Universe are fixed points in the divine dynamics. These fixed points interact with one another through networks rooted in God.


5.2: Human networking: telecommunication

Error free communication over long distances is not easy. Before the invention of writing, information was transmitted orally by travellers and traders. In Greek mythology, the Gods communicated with humans through their messenger Iris, the personification of the rainbow stretching from Heaven to Earth. The Hebrew and Christian Gods used angels to communicate with people (Greek angelos, Latin angelus = messenger). In the New Testament the angel Gabriel announced to Mary that she was to be the mother of Jesus (Luke 1:31). Iris (mythology) - Wikipedia, Angels in Judaism - Wikipedia

Once writing was invented, messengers could carry written texts. The Roman Empire was served by a network of roads used by a postal service, the cursus publicus, established by the emperor Augustus. Similar systems to transport people, goods and mail exist throughout the world. For centuries their maximum speed was limited by the horses of despatch riders and horse-drawn wagons and coaches. Other means of signalling, using flags, mirrors, smoke and fire, were also devised for special purposes, often military. Such long distance communication made the government of nations and empires possible. Roman Roads - Wikipedia, Cursus publicus - Wikipedia

The first major improvement on postal systems was the telegraph, which transmits information using electric signals carried by wire. These signals travel at close to the speed of light. Cable based telegraphy expanded from its invention in the early nineteenth century to encircle the world by 1902, radically changing global communication. Electrical telegraph - Wikipedia

Telegraphy enabled writing at a distance, as the word suggests. Messages are generally sent letter by letter using a binary code of short and long pulses, such as Morse code, in which letters are represented by sequences of pulses. Later in the nineteenth century the invention of microphones to transform sound waves into electrical waves, and speakers to perform the inverse transformation, enabled the development of telephony. Telephony allowed people to talk to one another naturally and did not require specialist operators. Like telegraphy, the development of telephony was a long evolutionary process, culminating in the award of a patent to Alexander Graham Bell in 1876 and the rapid expansion of wired telephone networks.

The first telegraph and telephone lines operated from point to point. Those wishing to use the service took their messages to a telegraph office which, like the post office, transmitted them to the recipient's office for a fee. The postal system directs messages using the address written on each item, so that they can be sorted and delivered to the intended recipients. Since telephone systems work in real time, the analogous development was the invention of the telephone exchange. Subscriber lines were concentrated at the exchange and could be connected to one another, enabling messages to be directed from one user to another. Telephone exchange - Wikipedia

At first the switching required to connect different subscribers in the network was done by human operators. Automatic telephone networks began to appear at the beginning of the last century and now most switching and messaging is digital. Much wiring has been replaced by optic fibre and telephone networks are rapidly becoming wireless.

The possibility of wireless telecommunication was revealed in 1864 when James Clerk Maxwell showed that light is electromagnetic radiation. In 1887 Heinrich Hertz was able to generate and receive wireless signals over a short distance. By the end of the nineteenth century, Guglielmo Marconi had developed a practical wireless telegraph. Soon afterwards voice transmission became possible and radio swiftly expanded to become a global technology. Instead of using wired exchanges to connect individual subscribers, radio messages between different users are distinguished by frequency. Transmitters and receivers must be tuned to the same frequency to connect with one another. Wireless - Wikipedia


5.3: The network "atom"

The "atom" of a network is a connection, two sources exchanging information over a communication channel. We are all very familiar with the logistics of connecting, which is why the network protocol serves so well as a medium for understanding the universe. A network is a set of connections. The connections between sources are often temporary, a short conversation; all the sources in a given network can communicate with one another, and some pairs of sources may connect more frequently this others. This provides a measure of the distance between sources. We say people are close when the spend more time together.

In face to face social circumstances a connection may comprise a sign on (hello, handshake or hug), a conversation back and forth, and a sign off (goodbye). In some circumstances, where careful diplomacy is required, rather long rituals of introduction may precede communication. On the telephone we dial a number; if the called party is available an introduction and a conversation follow, and when we are finished we terminate the call by hanging up. Machine communications follow similar protocols, often called handshakes. If we observe a network for a long period, we can estimate the traffic between various sources by seeing who is talking to whom, how often and for how long. This information is sometimes called metadata and is used by communication companies for billing, by law enforcement to detect illegal dealings and by academics to measure social interactions.


5.4: Noise: the mathematical theory of communication

A stable system requires stable communications, but there is so much communication going on in the world that crossed wires are a problem: all the messages I don't want to hear are noise for me. Fortunately the mathematical theory of communication shows us how to overcome much of the noise in our communication networks.

Noise is the enemy of accurate communication. Communication engineers have to face the problem that signals are corrupted by noise, which introduces errors into messages by confusing symbols. We have all had trouble understanding conversations at a noisy party or over a noisy phone line. The noise arises because the Universe is a very lively place, ceaselessly in motion. In almost any communication, some of that motion is the signal we want and some of it is unwanted. Of course noise for one source may be the signal for another. A crude measure of the quality of a connection is the signal to noise ratio, the ratio of the energy in the signal to the energy in the noise. As the noise grows louder in proportion to the signal, the quality of communication is degraded. Signal-to-noise ratio - Wikipedia
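
As a minimal illustration (the figures here are invented for the example, not taken from any real channel), the signal to noise ratio is usually quoted in decibels, ten times the base ten logarithm of the power ratio. A short Python sketch:

```python
import math

def snr_db(signal_power: float, noise_power: float) -> float:
    """Signal to noise ratio in decibels: 10 log10(P_signal / P_noise)."""
    return 10 * math.log10(signal_power / noise_power)

# A signal carrying a thousand times the power of the background noise
print(snr_db(1000.0, 1.0))   # 30.0 dB: a comfortable margin
print(snr_db(2.0, 1.0))      # about 3 dB: communication begins to degrade
```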

Claude Shannon, working for Bell Telephone, studied this problem and developed a mathematical theory of communication. Bell was faced with the need to get intelligible conversations through metal phone lines stretching 3000 kilometres across the United States. The problem was that noise from the wires, and from the amplifiers needed to combat the attenuation of the signal in long wires, introduced so many errors that speech could become unintelligible. Claude Shannon - Wikipedia

Shannon's theory established the possibility of error free communication. No matter how noisy the communication channels between them, sources can exchange information with negligible error. The cost, when a channel is noisy, is reduced speed. Shannon's theory does not tell us how to do this, only that it is possible. Much thought and ingenuity has subsequently gone into devising codes that approach Shannon's ideal. Coding theory - Wikipedia

Shannon was not concerned with the meaning of messages. His task as a communication engineer was to ensure that the receiver of a communication gets exactly the same set of symbols in the same order as that sent by the transmitter.

Shannon devised a measure of information which ignores the meaning of a message: it is simply a function of the variety and frequency of the symbols the sources exchange. This measure, entropy, serves as a measure of information: the information carried by a symbol drawn from a space of possible symbols is equal to the entropy of that space.

We imagine a source A which uses i different symbols, the source alphabet aᵢ. On the keyboard I am using, for instance, there are about 100 symbols: upper and lower case letters, numbers, punctuation and special characters. Each symbol has a certain probability pᵢ which we can estimate by counting a large enough stretch of text. In English we know that the space and the letters e, t, a, o and i are used more frequently than the letters k, x, j, q and z.

The formula for source entropy H is

H = −∑ᵢ pᵢ log pᵢ

The entropy of a source is the average amount of information it transmits or receives per symbol. The entropy of a message is the entropy per symbol multiplied by the length of the message. The time rate of flow of information in a channel is called its bandwidth. Bandwidth is the most important physical parameter in the design of communication networks where time is of the essence. We use it, for instance, to measure the speed of our internet connections, usually in bits or bytes (8 bits) per second. The cost of a connection is usually a function of its bandwidth (speed) and the total volume of data transmitted. Bandwidth (computing) - Wikipedia

This function is at its maximum when the probabilities pᵢ are all equal. One way to increase the entropy of an English source, therefore, is to transform its output into a set of symbols that are equiprobable. Morse code, used in telegraphy, approximates this ideal by using shorter codes for the more frequent letters. Morse code - Wikipedia
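
We can make this concrete with a short Python sketch (an illustration only, using a sample sentence rather than a large corpus). It computes the entropy of a text from the observed symbol frequencies and compares it with the maximum, reached when every symbol is equiprobable:

```python
from collections import Counter
from math import log2

def entropy_bits(text: str) -> float:
    """Shannon entropy H = -sum(p_i log2 p_i), in bits per symbol."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * log2(n / total) for n in counts.values())

sample = "the quick brown fox jumps over the lazy dog"
print(entropy_bits(sample))      # about 4.4 bits per symbol
print(log2(len(set(sample))))    # about 4.75 bits: the equiprobable maximum
```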

If the communication between two sources is perfect, the receiver receives exactly the same string of symbols as the sender transmits. This property can be checked by the receiver sending the message back to the sender, which can then compare what comes back with the original message. Asking a source to repeat what it has been told is a standard method for checking the integrity of communications.

Shannon's insight is supported by the fact that I can download many gigabits of information over my scratchy phone line without any errors.

The key to Shannon's theory is the observation that errors are caused by symbols being confused with one another. The probability of confusion is greater if the symbols are closer together. Security forces listening to a phone call might hear someone say "I am going to give arms to the poor" when what was actually said (and meant) was "I am going to give alms to the poor". The words are confused because they sound almost identical. Claude Shannon: Communication in the Presence of Noise

This suggests that error can be reduced by increasing the distance between symbols. Shannon's idea is to increase the size of the space of symbols while reducing the number of symbols required to send a given message, so that the symbols are further apart. This strategy is made possible by the fact that the size of the message space grows exponentially with the size of the messages that occupy it. So, given 100 symbols, we can make ten thousand two symbol strings, a million three symbol strings, a hundred million four symbol strings, and so on.

If we look in the dictionary, however, we find that there are only a few thousand four letter words, that is, legitimate four letter strings. This tells us that only about one in a hundred thousand four letter strings is a real word in a particular language; the rest are meaningless. This suggests that it should be possible to place the real words a long way apart in the message space so that they do not become confused with one another. If we see something that is not a legitimate word, we know it is wrong and can ask the sender to repeat it.

The transformation of messages to prevent error is called coding. When we talk, for instance, we code our mental states into packages like words and sentences. In natural languages there are many words that are quite easy to confuse, but by assembling them into sentences we provide context which reduces the chance of confusion. In networks like the internet the sender uses coding to transform messages so that all the legitimate packets are as far apart as possible. The receiver uses the corresponding decoding to recover the original message. The coding process requires computation, so we find the internet, one of the technological children of Shannon's theory, replete with computers to encode and decode the messages between individual sources. Our minds perform a similar function, unconsciously, as we talk to one another. The software that encodes and decodes messages is called a codec.
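
A toy example of this strategy (far simpler than the codecs actually used on the internet) is the threefold repetition code: by spending three symbols per bit it spreads the legitimate codewords so far apart that a single flipped bit in any block can be corrected by majority vote. A Python sketch:

```python
def encode(bits: str, n: int = 3) -> str:
    """Repetition code: move codewords apart by repeating each bit n times."""
    return "".join(b * n for b in bits)

def decode(received: str, n: int = 3) -> str:
    """A majority vote over each block of n recovers the original bit."""
    blocks = [received[i:i + n] for i in range(0, len(received), n)]
    return "".join("1" if b.count("1") > n // 2 else "0" for b in blocks)

sent = encode("1011")        # '111000111111'
received = "110000111110"    # the channel has flipped two bits
print(decode(received))      # '1011': the message survives the noise
```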

Shannon's theory tells us that a well coded signal is statistically indistinguishable from noise, because its entropy is at the maximum and its symbols are equiprobable. Where a signal differs from almost all sequences of noise, however, is that it is the output of an encoding algorithm. The signal can be fed into the corresponding decoding algorithm to yield the original message. It is not just noise, it is also a signal. Although the world appears to be full of noise, we cannot be sure that the noises are not messages until we have explored all possible decodings, usually an impossible task. This is a problem that confronts those searching for extraterrestrial intelligence and security organizations trawling the communication networks for clues to illegal operations.

We call the set of algorithms for encoding and decoding messages a language. Systems that speak the same language can communicate with one another easily, whereas those that do not have difficulty. Since the only constraint on a codec is that it be computable, there is a countable infinity of different languages, so the distances between them can be very great. By devising a private language implemented by a shared codec, sources can hide their communications from eavesdroppers. This technique is called encryption and is an essential method of maintaining privacy on the public networks that encircle the globe. Encryption - Wikipedia
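
The simplest private language of this kind is the one-time pad: the shared codec is a random key as long as the message, and the encoding is a bitwise exclusive-or. The sketch below is purely illustrative (the message is invented, and a real system must never reuse the key):

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Encode or decode by XORing each message byte with a key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet me at noon"
key = secrets.token_bytes(len(message))  # the shared secret, used once only
ciphertext = xor_bytes(message, key)     # statistically indistinguishable from noise
print(xor_bytes(ciphertext, key))        # b'meet me at noon'
```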


5.5: From the Turing machine to electronic computers

Newton's application of calculus to the study of the heavens motivated mathematicians to take a close look at the infinite and the infinitesimal. The issue had been studied in antiquity by people like Zeno, who used the idea that we can interpret motion as an infinite sequence of infinitesimal steps to argue that motion is impossible. Zeno devised these arguments to support his teacher Parmenides, who claimed that true reality is one and immobile and that the moving world is a delusion. Zeno's paradoxes - Wikipedia

Zeno's paradoxes have refused to die despite the best efforts of philosophers and mathematicians to kill them off. As Bertrand Russell noted in 1903:

In this capricious world nothing is more capricious than posthumous fame. One of the most notable victims of posterity's lack of judgement is the Eleatic Zeno. Having invented four arguments all immeasurably subtle and profound, the grossness of subsequent philosophers pronounced him to be a mere ingenious juggler, and his arguments to be one and all sophisms. After two thousand years of continual refutation, these sophisms were reinstated, and made the foundation of a mathematical renaissance . . . Russell: The Principles of Mathematics, page 347

Maybe Zeno was right. The advent of quantum mechanics has suggested that all motion proceeds stepwise in finite quanta of action. The notion of logical continuity introduced below suggests that we can make logical continua out of stepwise logical arguments.

Zeno's paradoxes were reexamined mathematically in the nineteenth century. From our point of view, this effort culminated in the work of Georg Cantor. Cantor developed and applied set theory to the study of infinity and devised the transfinite numbers. On close examination, set theory itself yielded paradoxes, one of which, Cantor's paradox, was observed by Cantor himself. Georg Cantor - Wikipedia, Cantor's paradox - Wikipedia

David Hilbert, like Plato and many others before him, thought mathematics to be perfect. He saw this perfection in three dimensions: consistency (no contradictions), completeness (no unanswerable questions) and computability (no insoluble problems). Stanley Burris: Hilbert and Ackermann's 1928 Logic Book

To make sense, mathematics must be consistent. It would make no sense if both the propositions x is true and x is false could be proven from the same starting point. If this were the case nothing could be certain, and we might see mathematics as a waste of time.

Kurt Gödel was able to show in 1931 that consistent mathematics can lead to uncertainty. We can devise propositions, that is legitimate strings of symbols, that cannot be proved either true or false. This means that mathematics is incomplete: there are unanswerable questions. Kurt Gödel: On formally undecidable propositions of Principia Mathematica and related systems I

In 1936, not long after Gödel's work, Alan Turing showed that there are mathematical computations that no computer can complete. In other words, mathematics includes incomputable problems. Gödel and Turing revealed that the formal perfection of consistent mathematics is consistent with uncertainty. Alan Turing: On Computable Numbers (ref above), Andrew Hodges (1983): Alan Turing: The Enigma

To prove incomputability, Turing devised a formal logical machine that could compute anything that might reasonably be called computable. He then showed that there were things that this machine cannot compute. He also found that the cardinal of the set of different computers (considered as a union of hardware and software) is the same as the cardinal of the set of natural numbers: both sets are 'countably infinite'. Their cardinals are represented by Cantor's symbol ℵ₀, the first transfinite number. Cardinal number - Wikipedia

We do not need a physical computer to understand Turing's paper because we can imagine the processes defined by the mathematical symbolism. The formal statements are in effect brought to life in our minds, a process we understand to be the product of a huge network of neurons.
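
We can also bring such a machine to life in a few lines of Python. The sketch below is a minimal illustration, not Turing's own formulation: it assumes a single tape with blank symbol '_' and a rule table mapping (state, symbol) to (new symbol, head movement, new state):

```python
def run_turing(tape: str, rules: dict, state: str = "start", max_steps: int = 1000) -> str:
    """Run a one-tape Turing machine until it halts (or give up)."""
    cells, head = dict(enumerate(tape)), 0
    for _ in range(max_steps):
        if state == "halt":
            return "".join(cells[i] for i in sorted(cells))
        symbol = cells.get(head, "_")
        cells[head], move, state = rules[(state, symbol)]
        head += move
    raise RuntimeError("no halt within max_steps: the halting problem bites")

# A machine that appends a 1 to a unary number, computing n + 1.
rules = {
    ("start", "1"): ("1", +1, "start"),  # scan right across the 1s
    ("start", "_"): ("1", +1, "halt"),   # write one more 1 and halt
}
print(run_turing("111", rules))  # '1111'
```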

Gödel and Turing's discoveries may explain the uncertainty and unpredictability we experience in the world. Some people, following Laplace, think that the world is deterministic, but these discoveries show that this is logically impossible in a consistent world. If the Universe is divine, this suggests that there are logical limits to the knowledge and activity of a self-consistent God. Laplace's demon - Wikipedia

Gregory Chaitin has shown that Gödel's theorem is very closely related to the cybernetic principle of requisite variety: a simple system cannot control a complex system. The controller must have at least as much information processing power as the system it is controlling to be able to deal with anything the controlled system may do. So we try to build our machines to preclude every possible avenue of error. Chaitin's approach tells us that the conclusion of a proof cannot be more complex than its inputs. Complex outputs require complex inputs. Chaitin: Gödel's Theorem and Information, Ross Ashby: An Introduction to Cybernetics

Using Turing's ideas, Chaitin has also translated Gödel's result into the language of algorithmic information theory. The important result for us is that no deterministic process (that is, a computation) can produce new information. Indeterminate random variation is therefore necessary for the creative evolution of new structures. Algorithmic information theory - Wikipedia

We observe that the future is often a lot more complex than the past, so we might conclude from this that the past can control the future if and only if the complexity of the future is reduced to the complexity of the past. This is the value of a computer. It is a deterministic machine whose behaviour is so constrained that it can carry us through time from an initial state to a final state without uncertainty. Often computers get themselves stuck in a loop, but this can be fixed by the most basic computer engineering algorithm of all: turn it off and turn it on again, that is reboot it.

Turing's negative mathematical result placed a boundary on computation, but his method of proof exposed the power of mechanical computation, setting off a computing revolution which has transformed our world. While there are uncomputable functions, there is also a huge space of computable functions, corresponding to those Turing machines that do arrive at a result. Since there are ℵ₀ different Turing machines, there are this many different computable functions. These computable functions are the foundation of the computation and communication industries. Internet - Wikipedia

Under the pressure of war, the formal system devised by Turing and his contemporaries began to be embodied in hardware capable of performing all the operations necessary to execute a computation. The ideas that had first existed encoded in the neurons of Turing's brain began to migrate to new physical substrates made of metal and glass.

The Second World War began a few years after Turing wrote his seminal paper. He applied his genius to breaking the encryption of Nazi military communications, devising techniques to facilitate this work and machines to implement those parts of it which could be reduced to mechanical computations. The first generation of these machines were known as bombes. The codebreakers then designed the machines known as Colossus, which were the first programmable computers. A little later ENIAC, the first general purpose computer, was built in the United States. ENIAC weighed 30 tonnes. Since then computers have gone through a rapid evolution which has led to handheld computers in phones that are many millions of times more powerful than ENIAC. Bombe - Wikipedia, Colossus computer - Wikipedia, ENIAC - Wikipedia

Since computers are necessary to encode messages so as to exploit Shannon's methods of error free communication, limitations of computation imply limitations on coding and error free communication. Whereas the uncertainty induced by Gödel's theorem may be thought of as lying in the space domain, the uncertainty arising from Turing's theorem is in the time domain. In life, both time and space are valuable assets and they are both quantized (pixellated) in multiples of Planck's constant. Time and space are both of the essence in questions of fitness. This is demonstrated by most sports, where being in the right place at the right time is an important prerequisite for victory.

An axiom of special relativity is that no causal influence can travel faster than the speed of light. This idea is represented diagrammatically by the light cone. An event at the origin of the light cone can be influenced by events in its past light cone and can influence events in its future light cone, but can have no contact with events outside its light cone. We can see this idea at work in ball games like tennis. Each player tries to place the ball outside their opponent's interception cone so that they cannot get to the ball quickly enough to return it. Light cone - Wikipedia

Points outside each other's light cones are said to be spacelike separated. This is true in physics but, as the tennis example above shows, it has wider application. In its early days the British invasion of Australia was managed from England. Since it often took the best part of a year to travel from England to Australia in a sailing ship, the imperial government's reaction time to events in the colony was about two years. The formal mathematical limits on control in a communication network measured by complexity must be extended to take account of the time it takes for the controlling system to react to events in the system controlled. We experience this personally when driving in heavy and chaotic traffic. The purpose of the road rules is to bring the space of permissible moves down to a level which can be managed most of the time by the average driver.


5.6: Two worlds: classical and quantum

Turing machines and the electronic computers derived from them inhabit the world of classical deterministic processes. Turing described his machine with a human computer in mind, working with a pencil and paper as people did in those days. We understand and design the electronic components in modern computers using quantum mechanics, just as we might use quantum mechanics to understand the structure of the steel in hammers and nails. Nevertheless the components of computers are designed to work classically in a manner exactly analogous to the human computers of Turing's day, writing and erasing, adding, subtracting and evaluating logical functions. Computer - Wikipedia

The motions of a computer are controlled by a clock which sets all its components in motion simultaneously, waits long enough for everything to settle into its new state, and then sets off the next move. One limitation on the speed of a computer is the time it takes the clock signal to propagate throughout the machine. The clock serves to separate the physical dynamics of the computer from its sequence of formal logical states so that everything is kept in sequence and synchronized.

A second limitation on computing speed is the amount of data processed by each step in the computing process. From a formal point of view one can construct a computer from a suitably connected set of nand operators which implement the logical function not-and (also known as the Sheffer stroke), which accepts two bits and gives the single output false if both input bits are true, and true otherwise. A single nand operation yields one bit per clock cycle. Things can be sped up by processing larger chunks of data per clock cycle. This development began with bytes of 8 bits, and many machines now use words of 64 bits.
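
As a sketch of this formal point (an illustration, not how real hardware is laid out), the Python below builds the ordinary logical operations, and from them a one-bit adder, using nothing but nand:

```python
def nand(a: int, b: int) -> int:
    """The Sheffer stroke: false (0) only when both inputs are true (1)."""
    return 0 if a == 1 and b == 1 else 1

def not_(a): return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b): return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

def half_adder(a, b):
    """One bit of binary addition, built entirely from nand: (sum, carry)."""
    return xor_(a, b), and_(a, b)

print(half_adder(1, 1))  # (0, 1): one plus one is binary 10
```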

Since the advent of integrated circuits in the early 1970s the industry has more or less followed Moore's law, approximately doubling the number of transistors in an integrated circuit every two years. This has led to a continuing decrease in the cost of computing power, but it cannot go on forever, since the components will eventually become so small that quantum uncertainty will compromise their classical determinism. Moore's law - Wikipedia

Quantum computation seeks to go behind the classical scenes to exploit the actual computational processes of the quantum world. This is a natural development. Quantum mechanics is a mathematical discipline which provides a hypothetical mathematical description of how the world works. Physicists use this theory to calculate how quantum systems might behave.

Classical physics is deterministic and cannot explain the creative nature of the universe. To do this and map the physical world onto God we need quantum physics. The development of quantum theory has been a long struggle. It seems rather strange and counter-intuitive from the classical point of view and describes a world which seems very different from ours. The interface between the two worlds is known as the quantum measurement problem. It remains controversial 120 years after the foundation of quantum mechanics and has led to a lot of rather remarkable speculation. After a discussion of classical networks we will go on to suggest that quantum mechanics also describes a networked structure. Measurement in quantum mechanics - Wikipedia


5.7: Finite classical computer networks

We now turn to the application of a communication network model to the relationships between the observable fixed points in the world, which we understand to be messages transmitted between instances of the quantum mechanical processes that underlie our classical world. These messages are physically embodied as particles of all sizes, from the fundamental particles studied by high energy physics to the planets, stars and galaxies studied by astronomers.

Formally a network is a set of communication links which we call the atoms of the network, analogous to a chemical atom made of electrons and protons held together by electromagnetic communication. In the human world a source may be a person, a church, a nation or anything else that can send and receive messages.

Error free communication is not easy. Shannon's ideas had to await the development of computers to do the encoding and decoding needed to achieve the high precision error prevention and correction necessary for computer networking. Technically the internet has been an enormous success. Like most technical developments, it has its upsides and its downsides from a social and political point of view. Here it serves simply as an example of a large network connecting billions of users. Compared to the network that makes a human body work, it is infinitesimally small. Below we will outline a system of transfinite networks sufficiently large to take care of networked systems of any consistent size and give an address to every quantum of action in the universe. History of the Internet - Wikipedia, Armen Zemanian: Transfiniteness for Graphs, Electrical Networks and Random Walks

The foundation of every network is a physical layer, which may be anything from roads and beasts of burden to messengers, letters, wires, wireless and light rays. Here we understand the ultimate physical layer to be the initial singularity, representing a quantum of action. All these media represent and transport physical states which can be exploited to carry information. The correct language to describe the physical layer is quantum mechanics, but in classical networks the electronic hardware is designed to behave in classical and predictable ways. Tanenbaum (1996): Computer Networks

A computer is itself a network, a set of components communicating with one another. At the heart of the machine is a clock which provides timing pulses that keep everything synchronised. The other hardware components are a processing unit, memory and interfaces to communicate with users and other computers.

The principal differences between a computer and a computer network are that the computers in a network may operate at different frequencies, that they are usually some distance apart, that the interfaces between the different machines need the ability to correct communication errors, and that the machines must share a common set of protocols to translate information from one machine to another. Larger networks may also have dedicated machines to provide a routing system to direct traffic between the machines. There is no particular limit to the size of a network. All that is necessary is an address space large enough to provide unique addresses to all the resources on the network. In the case of the internet this is the URL (Uniform Resource Locator). Telecommunications industry - Wikipedia, Computer network - Wikipedia, URL - Wikipedia

To simplify construction and maintenance, the software in a computer network is layered. Each layer is given a fixed task in the transformation of a message from users to the physical layer and back again. Each layer provides communication between the layers above and below it and transforms this information to couple the layers it serves. OSI model - Wikipedia

The first layer of computation interfacing with the physical layer may be devoted to encoding and decoding signals so as to provide error free transmission for all subsequent layers. Subsequent layers of software add more and more functionality. In the human world the top layer is traditionally a person, but an increasing number of machines ("things") now interact on telecommunication networks.

The Universe itself is a layered network, beginning with a physical layer of fundamental particles and gradually developing more complex structures until we come to ourselves, our planet, and the Universe as a whole. In the theological model we may consider the universe itself as the ultimate user of all the sub-networks that constitute it. The ubiquity and functionality of networks qualify them as a promising structure to be developed into a theory of everything, that is, a theology. They have many useful properties:

a) they create real relationships

We create and maintain our human relationships by communicating with one another. It seems that all other relationships, bonds and structures in the universe are also built on networked communication.

b) they are logically continuous

We may understand a halting Turing machine as a logically continuous mechanism. Given a certain initial state, it will move by a series of deterministic steps to a final state which is a logical consequence of the initial state: it executes a proof. Modern digital computers and error free digital communication channels are logically continuous in the same sense. The output of an error free transmission channel is identical to its input, apart from being relocated in space-time to a point on or within the forward light cone of the source.

Mathematics often imagines continua as comprising very large numbers of discrete points pushed together, which sounds a bit like a contradiction in terms. Aristotle produced a more consistent definition of continuity, having extremities in common, and computing machinery uses a similar approach: machines communicate by reading and writing to the same location in memory. We have an intuitive grasp of logical continuity and expect stories to make logical sense even if, like movies, they are presented cut up into small pieces which eventually come together like a jigsaw. We like every part of a story to be part of the overall picture. One of the tricks of storytelling is to conceal central issues in apparent irrelevancies which ultimately emerge as keys to the plot. Aristotle (continuity): 227a10 sqq

c) they may be made redundant and so fault resistant

The road network of a country or city normally provides many distinct routes to get from A to B. If one route is blocked, another is probably available. This property may be built into any network, so that a route may be found between any two sources, at least until the network is so badly damaged that it is divided into completely disconnected sections.

d) they can be logically mapped

A problem facing any network user is to decide which is the best route to take between two points. The usual way to do this is to consult a map of the network. Networks like the internet have machines devoted to routing, and the packets comprising a long message may take different routes from point to point. Engineered networks operate like a postal system. Each packet of information carries a destination address which the routing machinery uses to send it on in the right direction.
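
A minimal sketch of such routing (a toy map with sources A to D, not the internet's actual protocols) uses breadth-first search to find a fewest-hops route; it illustrates the redundancy of property c) at the same time, since when one link is cut another route is found:

```python
from collections import deque

def shortest_route(network, source, destination):
    """Breadth-first search for a fewest-hops route between two sources."""
    queue, visited = deque([[source]]), {source}
    while queue:
        route = queue.popleft()
        if route[-1] == destination:
            return route
        for neighbour in network[route[-1]]:
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(route + [neighbour])
    return None  # the network has broken into disconnected sections

net = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
print(shortest_route(net, "A", "D"))   # ['A', 'B', 'D']
net["B"].remove("D")                   # one link fails...
print(shortest_route(net, "A", "D"))   # ['A', 'C', 'D']: another route is found
```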

e) they embody both determinism and randomness

Networks like the internet can be designed to be deterministic, but uncertainty can enter in three ways. The first is error. The second is lossy encoding, used when some loss of detail is acceptable in order to reduce the size of messages and the cost of transmission. The third is the interruption and redirection of one process by interaction with other processes. Our lives in our social networks, for instance, are regularly redirected by our meetings with other people and events.

The effect of this uncertainty is to introduce some randomness into a networked system, enabling it to explore the whole space of possible connections. This space is defined by the transfinite numbers introduced below. This situation lays the foundation for an evolutionary process. Many potential connections will lead to inconsistency and failure. Others will be relatively stable. In the Darwinian model, stable relationships are the ones selected for survival.

f) their future is uncertain but their past is fixed

There may be many different routes through a network and the actual course of processing may depend on random events like the exact moment at which a machine is interrupted and its process redirected. In retrospect, however, the actual path taken is determinate, and in the case of the whole universe, can (in principle) be traced back to the initial singularity. In practical networks the machines write logs that can be used to trace errors and defects.

g) they increase complexity by copying

The function of a communication network is to copy information from one point to another in space-time. This generally increases the overall complexity of the system. The tendency is opposed by erasure: the death and decay (annihilation) of individual particles releases resources for new connections.

h) they are scale invariant

Fundamental particles, atoms, molecules, people, stars and galaxies are all instances of communication sources. The network paradigm applies regardless of the size of the communicating sources and the messages they transmit. This symmetry with respect to complexity makes networks a suitable vehicle for a comprehensive theory of everything. Below we will build a transfinite picture of the universe beginning with the quantum of action, which may be understood as the embodiment of the logical not: every act transforms some p into some not-p.

i) they reflect the layered structure of the universe

Networks are layered, beginning with the physical network. Each layer uses the services provided by the layer beneath it to serve the layer above it. We see a similar structure reflected in the world. Fundamental particles communicate with one another to form the networks we call atoms. Atoms network to form molecules, molecules cells, cells multicellular creatures. Such living creatures network to form ecologies, and so on. Each of these layers can exist without the layers above it, but relies completely on the layers beneath it for its existence. More intelligent systems manage their foundations to increase their chances of survival. On the model proposed here, everything depends on the existence of the initial singularity, which we identify with the traditional God. The survival of species (including ourselves) requires the maintenance of the ecosystems upon which they depend.

j) they may be multiplexed in time and space

Sources which are closely bound like electrons and nuclei may be in regular communication with one another, but most acts of communication are discrete entities with a finite lifetime. Much of the machinery in computer networks like the internet is devoted to making and breaking connections between different addresses.

k) they embody coding and decoding: language and meaning

The mathematical theory of communication is not concerned with the meaning of messages, but with accurate transmission.

The embodiment of meaning in messages is the work of the users (sources) in the network. In the case of the internet, the users are predominantly human. As in face to face communication, the meanings of messages are introduced by the users encoded in a language common to them both.

Like the traditional God, the initial singularity, taken by itself, means nothing. Formally, meaning is established by correspondence, as in a dictionary. Physically it is established by bonding, a relationship established by communication.

Atoms, although infinitesimal, are already quite complex structures. Each new layer of structure in the universe adds a new layer of meaning which is encoded in the syntax of the structure. This relationship between layers is universal, so I may consider myself an element of the human layer on Earth, relying on many ecosystem and economic layers for my survival and contributing to all the networks within which I am a source.

This structure establishes that the meaning of my life can be traced back to the initial singularity, and my life contributes to the meaning of all the systems of which I am part.

l) they enable testing and selection

The creation of anything, even a written sentence, requires trial and error, connections and disconnections. Evolution and the history of technology demonstrate this beyond doubt, since no species or technological product has ever emerged in its final form. Design is an evolutionary process of copying with variation. On the whole the variations are random: they cannot be predicted, but just happen and must be found. New designs are tested and selected when they begin to communicate with their environment.

m) there is no logical upper bound to the size or complexity of networks

From a logical point of view, there is no limit to the size of networks. Communications are nevertheless limited by the velocity at which messages can travel, which in the classical physical Universe is the velocity of light. Even so, deterministic local connections are available everywhere, and we will see that quantum entanglement appears to operate outside the constraints of space-time. Einstein, Podolsky and Rosen: Can the Quantum Mechanical Description of Physical Reality be Considered Complete?, Daniel Salart et al (2008): Testing the speed of 'spooky action at a distance'

Although there are only a countably infinite number of computable processes, there is no limit on the number of instances of each process that can be implemented at different addresses in a network, endowing it with parallel processing power beyond the reach of a single computer.


5.8: A transfinite computer network I: Formalism

The traditional God is understood to be infinite. The term infinite has two related meanings, very big and without boundary. Aquinas asks if God is infinite, and answers yes. His argument is based on Aristotle’s hylomorphism:

. . . form is not made perfect by matter, but rather is contracted by matter; and hence the infinite, regarded on the part of the form not determined by matter, has the nature of something perfect. Now being is the most formal of all things, . . . Since therefore the divine being is not a being received in anything, but is their own subsistent being . . . it is clear that God themself is infinite and perfect. Thomas Ainsworth (Stanford Encyclopedia of Philosophy): Form vs. Matter, Aquinas, Summa: I, 7, 1: Is God infinite?

From Aquinas’ point of view, God is infinite because form without matter is unbounded. Modern cosmology supports a similar argument from the large scale structure of space-time described by the general theory of relativity. This space is curved and closed in the sense that one can travel an unlimited spatial distance forward in time and come to no boundary, unless one crosses the event horizon surrounding a black hole.

Going back in time (which is theoretically but not practically possible) one also comes to a boundary of the universe known as the initial singularity. I have noted above that this point is formally identical to the traditional God, being both absolutely simple and the source of the Universe. The principal difference is that while traditional theology holds that God created a Universe distinct from itself (and could have made it very different), the cosmological view is that the Universe emerged within the initial singularity through the big bang. Hawking & Ellis (1975): The Large Scale Structure of Space-Time

Insofar as we understand that there is nothing outside the Universe, so that it is not contained in any way, Aquinas’ argument for infinity given above still holds. In this section I suggest that looking from inside the Universe we may also see that it is infinite.

Nowak pointed out that unlike most of the communication between other animals, human communication is syntactic, making infinite use of finite means. This strategy introduces an exponential increase in the detail that we can communicate. While an animal may grunt in many tones of voice, and birds can often sing long, beautiful and significant passages, we think that the human feelings that can be communicated by speech and gesture are very much more complex. Nowak, Plotkin & Jansen (2000): The evolution of syntactic communication

Turing's computer gets its power by taking advantage of syntax. The working memory of the computer is an ordered, countably infinite string of locations which the processor can read and write. As with the natural numbers, if the machine runs out of locations it is free to add another. Modern computers and networks differ in detail, but they remain deterministic machines whose behaviour depends upon symbolic input transformed by a stored program to give output.

In mathematics, syntax enables us to construct numerals to represent any cardinal number by forming ordered sets of digits, as in decimal numbers. A countably infinite string of decimal digits is considered adequate to represent any element of the set of real numbers. Toward the end of the nineteenth century, Georg Cantor set out to use syntactic methods to represent the cardinal of the continuum, and realised that ordered sets of symbols could be used to represent anything representable.

Aquinas’ argument is very simple and Aristotelian, but since Cantor's time modern mathematics has a lot more to say about infinity which is relevant to the structure of the Universe. It enables us to represent an abstract picture of the infinite dynamic system, which may be understood as the mind of God, in which we live.

Isaac Newton invented calculus to derive, explain and calculate his System of the World. Calculus models continuous motion in space and time by considering the ratio of distance travelled to time taken in progressively shorter and shorter intervals. Physically it seems quite reasonable, but mathematically it raised many of the problems first studied by Zeno in the fifth century BCE. Isaac Newton: The Method of Fluxions and Infinite Series, Nic Hugget (Stanford Encyclopedia of Philosophy): Zeno's Paradoxes

Many mathematicians, particularly in the nineteenth century, tried to produce a rigorous account of calculus and the related subjects of continuity and real numbers. Much of this work modelled the continuum as an infinite set of discrete points, which led Georg Cantor to investigate the cardinal of the continuum, the number of points it takes to constitute a continuum. Like Newton, he invented a new branch of mathematics in the process: set theory. Georg Cantor (1897, 1955): Contributions to the Founding of the Theory of Transfinite Numbers, Thomas Jech (1997): Set Theory

He defined a set (aggregate, Menge) as 'any collection into a whole M of definite and separate objects m of our intuition or our thought'. The beauty of a set is that it is not only a finite container which can be talked about and manipulated, but it can contain an infinite number of elements (its cardinal), effectively putting a handle on infinity.

The basic computational process in set theory is the establishment of one-to-one correspondences. This is a local process which is indifferent to the cardinals of the sets being compared. Two sets have the same cardinal if each element of one can be matched with a corresponding element of the other and none are left over on either side.
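
A small Python sketch of this matching process (finite sets only, purely for illustration):

```python
def same_cardinal(a: set, b: set) -> bool:
    """Pair off elements one by one; equal cardinals leave none over on either side."""
    pairs = list(zip(sorted(a, key=str), sorted(b, key=str)))
    return len(pairs) == len(a) == len(b)

print(same_cardinal({1, 2, 3}, {"x", "y", "z"}))  # True: a perfect matching exists
print(same_cardinal({1, 2, 3}, {"x", "y"}))       # False: one element is left over

# For infinite sets the pairing is given by a rule: n <-> 2n matches the
# natural numbers with the even numbers, so both share the cardinal aleph-0.
```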

In addition to their cardinal number, Cantor's sets have one other property, an ordinal type: 'The ordinal type of M is itself an ordered aggregate whose elements are units which have the same order of precedence amongst one another as the corresponding elements of M, from which they are derived by abstraction.'

Cantor began with the infinite set N of natural numbers. The cardinal of this set cannot be a particular natural number, because we can always add 1 and get a larger number, so he named the cardinal of the set ℵ₀, using the first letter of the Hebrew alphabet, aleph. He called ℵ₀ the first transfinite number. He then goes on:

To every transfinite cardinal number a there is a next greater proceeding out of it according to a unitary law, and also to every unlimitedly ascending well-ordered aggregate of transfinite cardinal numbers, {a}, there is a next greater proceeding out of that aggregate in a unitary way.

He named the cardinals of these increasingly large sets ℵ₁, ℵ₂, . . ., ℵₙ, . . .

This unitary way is based on the fact that there are many ways to order a given set by constructing subsets, combinations and permutations of its elements.

Let us suppose the existence of the power set P(S) of S, the set of all subsets of S. P(S) is in effect the collection of all the combinations of the elements of S. Cantor's theorem then asserts that the cardinal of P(S) is greater than the cardinal of S. Cantor's theorem - Wikipedia
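
We can watch the theorem at work on small finite sets with a Python sketch: for a finite set S the power set has 2^|S| elements, always strictly more than |S|:

```python
from itertools import chain, combinations

def power_set(s: set) -> list:
    """All subsets of s: every combination of its elements."""
    items = list(s)
    return list(chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1)))

for n in range(5):
    s = set(range(n))
    print(n, len(power_set(s)))  # 1, 2, 4, 8, 16: always 2**n, greater than n
```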

Cantor's hope of representing the cardinal of the continuum was dashed by Cohen's proof that the continuum hypothesis is independent of set theory. The heart of the problem is that the concepts of set and cardinal number are independent of one another, which is both a strength and a weakness of sets. Nevertheless, set theory has become a foundation of mathematics, and it shows how we can model a transfinite computer network. Paul Cohen (1980): Set Theory and the Continuum Hypothesis

Cantor's theory upset some theologians who felt that infinity is the unique attribute of God and there can be no 'created' infinity. This problem was avoided by Cantor's tacit use of formalism. Although in reality all information is represented physically, mathematicians are still free to imagine that the symbol N may stand for anything consistent, such as the infinite set of natural numbers.

Cantor's ideas seem to have had their foundation in his theological views. Hallett writes:

It is clear that Cantor understands pure set theory as a quite general foundational theory which prepares the way for any theory which uses or relies on sets or numbers. But now we come back to theology and God, for this foundation, this understanding of what numbers are, or what sets etc exist, is for Cantor intimately connected with the attempt to understand God's whole abstract creation and the nature of God himself. Michael Hallett (1984): Cantorian Set Theory and Limitation of Size

Cantor set out to define an absolute infinity which was to be characteristic of God. He found that the existence of such an infinity was not self consistent, something now known as Cantor's paradox. Cantor had proved that every set can generate a larger set. This must hold for any candidate absolutely infinite set, which is therefore no longer absolutely infinite. Joseph Dauben: Georg Cantor: His Mathematics and Philosophy of the Infinite

From this we may conclude that transfinite mathematics does not have an element corresponding to God. Instead it has a series of elements which may approach but never reach the immensity of God. We can talk about these subsets of the whole (sometimes called universes of discourse) without contradiction. Further, Cantor's theorem guarantees that no matter how large a universe of discourse we decide to study, it will remain always a subset of the strictly greater set arising from the Cantor expansion of our chosen universe. It may be that this network structure has sufficient power to represent every quantum of action in the life of the universe.

A memory is a system which changes state under the action of some force and remains in that state until it is changed again. The source of the force that changes the memory we may call a processor. From an abstract point of view, both a computer and a computer network are sets comprising processors and memories: computer = {processor, memory}. This is the structure of Turing's original abstract machine. Each element of computer memory has two properties: an address and a set of internal states which may be read from and written to by a processor. Ideally each element is independent, so that its state is completely determined by the signals it receives. In mechanical computers memory elements usually have two states, often represented by 0 and 1. In neural systems, information is stored in synapses which may have a wide range of states. Memory is a ubiquitous feature of the universe. In the classical world most things stay in one state unless they are moved by a force; this fact serves as a definition of both state and force. Quantum states, on the other hand, even stationary states, are all considered to have energy and therefore to be always in motion. Synapse - Wikipedia

The memory of the transfinite computer network is modelled on and addressed by the Cantor space of transfinite numbers. This is based on two set-theoretical notions: cardinal and ordinal numbers. Ordinal number - Wikipedia

Cantor's proof used the power set, which can be interpreted as a set of combinations. We can imagine an even bigger set, the set of permutations. The number of permutations of the natural numbers is the number of ways of mapping the natural numbers onto themselves, so we could write ℵ₁ = ℵ₀!, the cardinal of the permutation group of the natural numbers. By Cayley's theorem, every group is isomorphic to some permutation group. This may explain why group theory is of such value in mathematical science. Permutation group - Wikipedia
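
Cayley's theorem can be checked mechanically for any small group. In the sketch below (Python, purely illustrative) each element of the cyclic group Z₄ is represented by the permutation it induces on the group by addition, and composing these permutations is seen to mirror the group operation:

# Illustrative check of Cayley's theorem for the cyclic group Z4:
# each element g acts on the group as the permutation x -> (g + x) mod 4.

n = 4
elements = range(n)

def perm_of(g):
    """The permutation of Z4 induced by adding g."""
    return tuple((g + x) % n for x in elements)

def compose(p, q):
    """Apply q first, then p, as functions on {0, ..., n-1}."""
    return tuple(p[q[x]] for x in elements)

# The group operation g + h corresponds to composition of the permutations.
for g in elements:
    for h in elements:
        assert compose(perm_of(g), perm_of(h)) == perm_of((g + h) % n)

print("Z4 embedded in the permutation group S4:", [perm_of(g) for g in elements])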

Finite arithmetic is a deterministic process: 2 × 2 = 4, 2² = 4, log₂4 = 2 and so on. Transfinite arithmetic, on the other hand, is more flexible, as the following expressions show:

ℵₙ + ℵₙ = ℵₙ × ℵₙ = ℵₙ
2^ℵₙ = ℵₙ^ℵₙ = ℵₙ₊₁
log₂ℵₙ = ℵₙ₋₁
ℵₙ! = ℵₙ₊₁

Cantor felt that this construction was big enough to represent anything representable. He wrote:

The concept of 'ordinal type' developed here, when it is transferred in like manner to 'multiply ordered aggregates' embraces, in conjunction with the concept of 'cardinal number' or 'power' . . . , everything capable of being numbered (Anzahlmassige) that is thinkable, and in this sense cannot be further generalized. Cantor 1955, p 117

One of the developments which made practical computers possible is random access memory, RAM. RAM is structured so that the processor can reach any location in the memory directly without passing through any other locations. To achieve this the memory is structured in a treelike manner so that when the processor sends an address, the most significant bit of the address chooses one half of the memory, the next significant bit half of that, and so on until the least significant bit chooses the actual location to be read or written. In a computer that uses 64-bit addresses, this enables the selection of one of 2⁶⁴, about 1.8 × 10¹⁹, memory locations. Random-access memory - Wikipedia
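
The bit-by-bit tree walk can be sketched as follows (Python; an illustrative toy, not real address-decoding hardware): each address bit halves the remaining range until a single cell is selected.

# Toy model of random access: walk a binary tree of memory cells,
# letting each address bit choose the lower or upper half of the range.

def select(address_bits, memory):
    lo, hi = 0, len(memory)              # start with the whole memory
    for bit in address_bits:             # most significant bit first
        mid = (lo + hi) // 2
        if bit == 0:
            hi = mid                     # bit 0: keep the lower half
        else:
            lo = mid                     # bit 1: keep the upper half
    return lo                            # one location remains

memory = list(range(16))                 # 16 cells need 4 address bits
location = select([1, 0, 1, 1], memory)  # address 1011 binary = 11 decimal
print(location)                          # 11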

Random access memory is made possible by the three dimensional structure of space. Anyone trying to design a two dimensional circuit board with uninsulated circuits soon discovers that points on the surface become inaccessible because wires cannot cross without short-circuiting. In 3D space, however, any two points can be joined without crossed wires. We will return to this point in chapter 6 where we discuss the construction of a network universe.

Like Cantor, we might want a transfinite network to model a particle of significant complexity, such as a human body. Since we are dealing with a static, formal mathematical system, this model must be a snapshot, a static representation of a tiny moment of time. Quantum theory equates energy and time through the Planck-Einstein relation E = hf. Since frequency f is the inverse of time, we may approximate a static interval Δt by the equation Δt = h/E. My mass m is about 80 kilograms, E = mc², and Planck's constant h is about 7 × 10⁻³⁴ Joule seconds, so Δt for my body is about 10⁻⁵² seconds, corresponding to a frequency of 10⁵², the processing rate of my life in terms of quanta of action per second. Planck constant - Wikipedia
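
The arithmetic is easy to check. The sketch below (Python) recomputes Δt = h/E with the rounded constants used in the text:

# Recompute Δt = h / E for an 80 kg body, with E = m c².

h = 7e-34        # Planck's constant, J·s (rounded as in the text; CODATA: 6.626e-34)
c = 3e8          # speed of light, m/s
m = 80.0         # mass, kg

E = m * c**2             # rest energy in joules
dt = h / E               # the 'static interval' of the text
f = 1 / dt               # corresponding frequency, quanta of action per second

print(f"E  = {E:.1e} J")     # about 7.2e+18 J
print(f"dt = {dt:.0e} s")    # about 1e-52 s
print(f"f  = {f:.0e} Hz")    # about 1e+52 quanta per second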

We can imagine this snapshot as a record of the state of the memory of my body for this moment in time. At the lowest hardware level, each unit of memory contains a quantum of action which we may imagine as a fundamental particle. We may divide this memory into blocks equivalent to atoms, molecules, cells, tissues and so on to create my whole body, and imagine each of the layers as represented by a transfinite number. Although Cantor started his hierarchy with the countably infinite set of natural numbers, his proof remains true for any set, finite or transfinite. So the fact that a proton, for instance, has only about 11 parts (8 gluons and 3 quarks), or a hydrogen atom just 2 (a proton and an electron), fits the system, where we apply the rather flexible transfinite arithmetic shown above.

Back to top

5.9: A transfinite computer network II: Dynamics

The finite velocity of light means that the calculation above is quite unrealistic, since a light signal cannot even cross a proton in 10⁻⁵² seconds, so we must turn to a discussion of local dynamics to create a consistent application of a transfinite network to the universe. This means that Cantor's idea that his numbers could actually represent God or the universe died with the advent of special relativity and Minkowski space. Relativity suggests that the universe is inherently dynamic and all physical interactions require at least contact, if not continuity.

Aristotle studied space and provided a definition of continuity in his Physics which supposes that two lengths are continuous if their ends overlap (5.7b).

This definition is reminiscent of Aristotle’s syllogistic logic in which the premises leading to a conclusion overlap by a middle term. This suggests a notion of logical continuity which is instantiated in a mathematical proof, a computation or an error free communication (5.7b).

Here I use the concept of logical continuity to apply Cantor's insights to the structure of the universe, using a transfinite computer network as a backbone for the idea that the universe is a divine mind.

The power of Cantor's theory arises from two features. First, as Cohen pointed out, there is no connection between the notion of a set and the notion of cardinal number; we might say that sets are symmetrical or indifferent with respect to cardinality. Second, Cantor's method of comparing the cardinals of sets, the establishment of one-to-one correspondences, is local. We do not have to deal with whole sets, just one element at a time from each. Just as when counting sheep through a gate, we attach one numeral to each sheep and can, without formal inconsistency, use this method to enumerate an infinite flock.
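
This local, element-at-a-time pairing is exactly what a programmer would call lazy evaluation. A small illustration (Python; generators stand in for the infinite sets):

from itertools import count, islice

# Pair each natural number with an even number, one element at a time.
# Neither 'set' is ever held in full; the correspondence is purely local,
# like tagging sheep one by one as they pass through a gate.

naturals = count(0)                      # 0, 1, 2, 3, ...
evens = (2 * n for n in count(0))        # 0, 2, 4, 6, ...

correspondence = zip(naturals, evens)    # one-to-one, element by element
print(list(islice(correspondence, 5)))   # [(0, 0), (1, 2), (2, 4), (3, 6), (4, 8)]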

Cantor began the construction of his transfinite numbers with the natural numbers, which are countably infinite. Turing's halting computing machines can be viewed as finite strings of elements, which we can identify with the memory tapes of the machines. The number of possible Turing machines is also countably infinite, so we can place them into one-to-one correspondence with the natural numbers and build networks of connected Turing machines analogous to the structure that Cantor used to generate the transfinite numbers. We imagine the computers as sources connected by communication channels, so that the output of one can connect to the input of another, and the system of all possible connections represents a permutation group of the natural numbers. Natural number - Wikipedia

We may think of an action as any process that changes some p into some not-p. This logical definition embraces anything from the emission of a photon by an atom to a supernova and beyond. In an electronic computer the basic action is a combination of read and write. A machine reads an element of memory in order to write its content into another element. When you read this text, you are transforming it into a signal which is written in the synapses of your brain.

A computer network has a physical backbone of electrical wires, optic fibres and wireless links comprising photons, but the actual operation of the network is more like the postal system. The messages transmitted in both systems are addressed packages, like letters or packets of data, and these are transmitted intermittently. It is impossible for everyone to communicate with everyone else simultaneously, and the same is true in the universal network. In a single computer, much of the addressing is wired in.

The basic protocol for connecting computers into a network is to allow them to read from and write to shared memory, so that the output of one can be read into the input of another and vice versa. From a practical point of view, this is made possible by a communication protocol which encodes and decodes information so that both machines can use it. The biggest computer network on the planet, the internet, is based on a protocol suite which was developed between the 1960s and the 1980s by a large number of individuals and corporations. Overall these protocols establish a common linguistic environment for sharing digital information with (relative) security. Internet protocol suite - Wikipedia
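
As a much simplified illustration of what a protocol does, the sketch below (Python) invents a toy encoding, a length header followed by UTF-8 text. It bears no relation to the actual internet protocols, but it shows how an agreed code lets one machine read exactly what another wrote:

# A toy protocol: a 4-byte big-endian length header followed by UTF-8 text.
# Both 'machines' agree on this encoding, so one can decode what the other writes.

import struct

def encode(message: str) -> bytes:
    payload = message.encode("utf-8")
    return struct.pack(">I", len(payload)) + payload   # header + body

def decode(packet: bytes) -> str:
    (length,) = struct.unpack(">I", packet[:4])        # read the header
    return packet[4 : 4 + length].decode("utf-8")      # read exactly 'length' bytes

shared_memory = encode("hello, network")    # machine A writes
print(decode(shared_memory))                # machine B reads: hello, network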

Another source of power in computation arises from set theory itself: it is the power of 'containerization'. Although I am a hideously complex organism, with trillions upon trillions of molecular parts, I go by one name, and can be processed under that name, as is done by the government and all the businesses and people I deal with. It is this power which enables us to encompass the countable infinity of natural numbers in the phrase the set of natural numbers, N. We think of this power in terms of meaning: my name means me, the set of my transfinite number of networked components.

Naming is not restricted to mathematicians, or to humans in general. It is an essential feature of all communication. Everything is named by its scent, its image, the sounds it makes, the tracks it leaves, its address and so on. An animal seeking a mate can use all these clues to decide whether or not another individual is the one. The amount of information communicated is usually an infinitesimal portion of the information stored in the sources of the communication. Many of the sources I connect with like to be assured that I have but one unique name so that they know who they are dealing with.

In the physical world, naming is achieved by binding. The fundamental particles that make up an atom gain an identity: they are the particles of this particular atom. The particles are always on the move, however, so that they find themselves in different arrangements with different identities. This paradigm holds at all scales, as we can see in our own social interactions, forming and breaking bonds with different people for different reasons all the time. On the other hand, there are many symmetries in nature which enable one particle of a particular class, an electron for instance, to substitute for another. In the space of human rights we should all be considered to be identical, indistinguishably equal before the law and the processes of justice.

Back to top

5.10: Quantum mechanics describes a computation network

Experience has shown that modelling quantum behaviour on a classical computer is very demanding. Yet if the world works according to our quantum models, these computations are carried out naturally in real time. Wilczek notes that it has taken many hours of supercomputer time to model the internal behaviour of a proton; protons themselves do the same work in infinitesimal fractions of a second. Quantum computers support the dream of harnessing this power. Frank Wilczek: The Lightness of Being: Mass, Ether, and the Unification of Forces pp 112 sqq

Richard Feynman set out the rules of quantum mechanics in the third volume of his famous lectures on physics. After describing the "two slit" experiment he outlines the first principles of quantum mechanics: Double-slit experiment - Wikipedia

. . . An ideal experiment is one in which all of the initial and final conditions of the experiment are completely specified. What we will call "an event" is, in general, just a specific set of initial and final conditions. (For example: “an electron leaves the gun, arrives at the detector, and nothing else happens.”) Now for our summary.

Summary:

The probability of an event in an ideal experiment is given by the square of the absolute value of a complex number φ which is called the probability amplitude:

P = probability;
φ = probability amplitude;
P = |φ|²

When an event can occur in several alternative ways, the probability amplitude for the event is the sum of the probability amplitudes for each way considered separately. There is interference:

φ = φ₁ + φ₂
P = |φ₁ + φ₂|²

If an experiment is performed which is capable of determining whether one or another alternative is actually taken, the probability of the event is the sum of the probabilities for each alternative. The interference is lost:

P = P₁ + P₂

One might still like to ask: “How does it work? What is the machinery behind the law?” No one has found any machinery behind the law. No one can “explain” any more than we have just “explained.” No one will give you any deeper representation of the situation. We have no ideas about a more basic mechanism from which these results can be deduced. Feynman, Leighton & Sands FLP III:01: Chapter 1: Quantum behaviour
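
Feynman's summary translates directly into arithmetic. In this sketch (Python; the amplitudes are chosen arbitrarily for illustration), adding amplitudes before squaring produces interference, while squaring before adding, as when the alternatives are distinguished, destroys it:

import cmath

phi1 = cmath.rect(0.6, 0.0)         # amplitude for path 1 (modulus 0.6, phase 0)
phi2 = cmath.rect(0.6, cmath.pi)    # amplitude for path 2, opposite phase

# Indistinguishable alternatives: add amplitudes, then square. Interference.
P_interfering = abs(phi1 + phi2) ** 2

# Distinguished alternatives: square each amplitude, then add. No interference.
P_separate = abs(phi1) ** 2 + abs(phi2) ** 2

print(round(P_interfering, 12))     # 0.0  -- the two paths cancel completely
print(round(P_separate, 12))        # 0.72 -- no cancellation once the path is known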

In the next chapter we will take a shot at an explanation by wondering how the Hilbert space of quantum amplitudes gives rise to the Minkowski space of everyday physics. Minkowski space - Wikipedia

Democritus thought of atoms as tiny little uncuttable things. Our modern understanding of atoms is similar, although they are no longer uncuttable. Our atoms have parts, the more obvious being protons, neutrons and electrons, and they can be taken apart. On the other hand, the real quantum mechanical atom is not a thing so much as an action or event, the quantum of action. The classical dimension of action is angular momentum, so we might think of the quantum of action as one turn of a cyclic group. In other words, motions in the Universe are not strictly continuous, although they may appear so in our macroscopic world, but comprise atomic steps of action whose size is measured by Planck's constant, an exceedingly small number.

Energy is coupled to wave motion. We see this spectacularly in huge ocean waves, but it is also true on the smallest scale in the universe, measured by the quantum of action. Energy, like waves, is a sequence of actions. The fundamental equation of quantum mechanics is the Planck-Einstein formula E = hf, where E represents energy, h the quantum of action, and f frequency, the time repetition rate of quanta of action. The amplitudes of quantum mechanics are waves, each cycle of amplitude representing one quantum of action. Planck-Einstein relation - Wikipedia

We represent waves by complex numbers, the most fascinating species of mathematical objects. Ordinary numbers like 1, 2, 3 are linear: they just get bigger and bigger as we count higher and higher. Complex numbers, on the other hand, have no natural order and are cyclic, which makes them perfect for describing anything cyclic, like a musical note or an ocean wave. The most fascinating complex number of all is the complex exponential. Feynman provides a brief introduction to algebra and complex numbers. Feynman Lectures on Physics, Vol. I, chapter 22: Algebra

One can learn a lot about waves by throwing stones into a tranquil pond and watching the ripples as they intersect. Part of a ripple is above the normal water level (we will call this the positive phase) and part of it is below, the negative phase. Positive and negative phase follow each other across the pond and where the ripples cross the phases add and subtract to create new waves. This process is called superposition or interference and it happens in quantum mechanics when the sources of different amplitudes are indistinguishable. Quantum superposition - Wikipedia

The technicalities of quantum mechanics are a little obscure, but they may be easier to understand in the network terms explained below. For the record, quantum mechanics represents the states or amplitudes of the world, |ψ>, by vectors in a complex Hilbert space. The complexity of the states is represented by the dimension of the Hilbert space, which may vary from 1 to a countable infinity. State vectors are superpositions of the basis vectors of the corresponding Hilbert space. Quantum amplitudes are invisible to us, but we understand them to be represented by particles. We model quantum events as occurring when two amplitudes meet and 'measure' one another through their interaction. Hilbert space - Wikipedia, Mathematical formulation of quantum mechanics - Wikipedia, Function (mathematics) - Wikipedia, Function space - Wikipedia

The heart of quantum mechanics can be expressed in six propositions. Three of these propositions are mathematical and embody the linearity and unitarity of quantum systems:

(1) the quantum state of a system is represented by a vector in its Hilbert space;

(2) a complex system is represented by a vector in the tensor product of the Hilbert spaces of the constituent systems;

(3) the evolution of isolated quantum systems is unitary, governed by the Schrödinger equation:

iℏ ∂|ψ>/∂t = H|ψ>

where H is the energy (or Hamiltonian) operator.

Wojciech Hubert Zurek: Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical, Schrödinger equation - Wikipedia

The other three show how the mathematical formalism couples to the space-time world:

(4) immediate repetition of a measurement yields the same outcome;

(5) measurement outcomes are restricted to an orthonormal set { |sₖ> } of eigenstates of the measured observable;

Quantum state - Wikipedia

(6) the probability of finding a given outcome is pₖ = |<sₖ|ψ>|², where |ψ> is the pre-existing state of the system.

Born rule - Wikipedia

The energy operator may be represented by a square matrix of the same dimension as the Hilbert space of the system of interest. The state vector of this system, |ψ>, is a superposition of the basis vectors of this space. The elements of the matrix encode the energy (rate) of interaction between the elements of the state vector.
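
For readers who want to see these axioms in action, here is a numerical sketch (Python with numpy; the Hamiltonian and time are arbitrary illustrative numbers, and we set ℏ = 1): the state evolves unitarily under the Schrödinger equation, its norm is conserved, and the Born rule of axiom (6) turns the evolved amplitudes into probabilities.

import numpy as np

# Axioms (1), (3) and (6) in the smallest interesting case: a two-dimensional
# Hilbert space, an arbitrary Hermitian Hamiltonian, units with hbar = 1.
# Solutions of the Schrödinger equation are the unitary operators U(t) = exp(-iHt).

H = np.array([[1.0, 0.5],
              [0.5, 2.0]])                   # illustrative energy operator

psi0 = np.array([1.0, 0.0], dtype=complex)   # axiom (1): a state vector

# Build U(t) = exp(-iHt) from the eigendecomposition of the Hermitian matrix H.
energies, modes = np.linalg.eigh(H)
t = 0.7
U = modes @ np.diag(np.exp(-1j * energies * t)) @ modes.conj().T

psi_t = U @ psi0                             # axiom (3): unitary evolution

print(np.linalg.norm(psi_t))                 # ~1.0: unitarity preserves the norm
probs = np.abs(psi_t) ** 2                   # axiom (6): the Born rule
print(probs, probs.sum())                    # outcome probabilities, summing to 1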

A network comprises a set of nodes, agents or sources connected by a set of channels through which they can communicate. Our first step toward identifying the quantum axioms with a network is to map the sources in the network to the basis vectors in the Hilbert space of the quantum mechanical description.

If we think of each pair of vectors in a Hilbert space as defining a channel in a network, then the entries in the Hamiltonian matrix represent the flow of traffic on this channel. The Born rule computes the probability (that is the frequency) of traffic between the states represented by the two vectors.

Then:

Axiom (1): We let each basis vector in the Hilbert space correspond to a unique source in a network. In computer terms, the state vector represents the state of the network memory, which is distributed among the basis vectors. This state is dynamical, as described by the Schrödinger equation, whose solutions are unitary operators, reversible and entropy preserving, like lossless codecs in a computer network.

Axiom (2) describes the creation of an internet between two quantum networks. The number of links in the new network is the product of the numbers of sources in the constituent networks, since each source in one of the networks has access to all the sources in the other.

Axiom (3) describes the evolution of network traffic subject to the constraint that total traffic in a particular network is constant and normalized to 1. If traffic on one channel increases, it must decrease on another. This situation is a consequence of linearity of quantum mechanics and the conservation of energy in a quantum network, since the frequency of communication is measured by energy.

The three mathematical axioms above describe a system which is not directly observable and therefore to some extent hypothetical, to be supported by its observable consequences. The mathematical system described is not quantized but evolves according to the Schrödinger equation, continuously and unitarily as described by axiom (3). Observation occurs when two quantum systems communicate, so that they become correlated. In laboratory systems one system is usually designated as the observer and the other as the observed, but in nature quantum systems are continually "observing" one another to produce the classical world. Unitarity is broken by observation and entropy is increased. The dynamic quantum system is stopped to yield fixed particles which are at least momentarily eternal. This situation is described by axiom (5). von Neumann, op. cit. chapter 5

Axiom (5) introduces the idea that what we see when we observe a quantum system depends on the operator we use to look at it. The results we obtain are restricted to the orthonormal eigenstates of the measurement operator. This introduces quantization. Such quantization appears necessary to enable the error-free transmission of information from one quantum system to another. Zurek (ref above)

Axiom (4) attests to the robustness of the observed fixed points. From the network point of view, these fixed points represent the halted states of the computers that are represented by the eigenfunctions of quantum mechanics.

Axiom (6) establishes that the statistics of a quantum observable are constrained by the same normalization that we find in the mathematical description of communication sources. Communication theory requires for a source A that the probabilities pᵢ for the emission of letters aᵢ of the source alphabet be normalized so that Σᵢ pᵢ = 1.
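
Such a source takes only a few lines to model (Python; the alphabet and its probabilities are invented for illustration). We know its statistics exactly, yet we cannot predict its next letter, which anticipates the roulette comparison a few paragraphs below:

import random

# A communication source in Shannon's sense: an alphabet with probabilities
# that are normalized to sum to 1.
source = {"a": 0.5, "b": 0.25, "c": 0.25}
assert abs(sum(source.values()) - 1.0) < 1e-12   # the constraint Σi pi = 1

# We know the statistics of the source, but not its next letter.
letters = list(source)
weights = list(source.values())
message = random.choices(letters, weights=weights, k=20)
print("".join(message))    # e.g. 'abacaabcbaacabbaacba' -- unpredictable in detail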

Historically, the first three postulates of quantum mechanics have been considered uncontroversial, but there has been endless debate about the interpretation of the mathematical formalism encapsulated in postulates (4) - (6). The paper by Zurek referred to above has clarified the situation slightly by showing that if we regard a quantum observation as an act of communication, the mathematical postulates of quantum mechanics imply the observational postulates.

Modern scientific epistemology accepts Einstein's view that we can trust only knowledge obtained by direct contact with the entity we wish to know. The foundations of physical knowledge are observed events. Heisenberg sought to free quantum mechanics from classical misconceptions by insisting that only the phenomena need be explained; a theory has no standing except insofar as it does this (or at least promises to do it). Werner Heisenberg

The success of continuous formalism does not therefore guarantee that the Universe itself is continuous. In practical physics all our computations are implemented logically and digitally, and our digital approximations to continuous systems are limited only by the computing resources available. Even in the current state of the art they are comparable to the precision of any practical experiment.

Much of the literature of quantum mechanics speaks of 'wave-particle' duality. This duality, however, is 'broken'. We observe particles. We do not observe waves directly, but rather find periodic structures (suggestive of waves) in repeated observations of certain systems, like the paradigmatic two slit experiment. This wavelike structure is reflected in the probabilistic interpretation of the complex quantum amplitudes used in the mathematical formalism of quantum mechanics.

The theory of computation is also cyclic, recursive or wavelike. The power of a computer lies in its ability to perform very simple operations repetitively at great speed. The fundamental operator in a practical computer is the clock, which in effect implements the logical operation not, where tick = not-tock. The clock pulses serve to order the operations of the computer.

Unlike classical physics, which (at least in theory) predicts exactly what is going to happen in a given situation, quantum mechanics behaves more like a roulette wheel. While it is spinning, all its states are a superposed blur. When the wheel stops, the ball will fall on a definite number, but until it stops, we do not know which number it will be. In a fair game of roulette, the probabilities of all the numbers are equal. In the quantum mechanical case, while the outcomes are definite, their probabilities may be different. The theory enables us to predict the probability of each outcome, but not the actual outcome in each instance. In this respect, a quantum system is very like a communication source: we may know the alphabet of the source and the probability of the various letters, but we cannot predict exactly what the source is going to say next.

We are proposing that the transfinite computer network serves as a model for the interactions of the fixed points in the divine Universe. Up to this point in the discussion, we have been describing a formal structure constructed according to the rules of formalist mathematics: all we require is consistency. The only boundaries recognised on consistent mathematics are those discovered by Gödel and Turing where completeness and computability give way to incompleteness and incomputability.

Quantum mechanics is indifferent to the complexity of the space in which it operates. It treats a two dimensional system just like an infinite dimensional system. This means that it operates indifferently at all levels of complexity. Since the number of fixed points in a quantum system is equal to the dimension of the function space it occupies, and there is no limit on the complexity of these spaces, there is a natural connection between the transfinite numbers and the fixed points of quantum systems.

The power of order to create complexity, formalized by Cantor, gives us a mathematical window on the creative power embodied in the divine Universe. It also provides us with some insight into the unlimited power of our imaginations, since we can imagine unlimited orderings of all the species of events that occur around us. One does not have to be with children for long to discover that they are endlessly creative, coming up with new ways to use (and misuse) every element of their environment.

The transfinite hierarchy of function spaces developed by Cantor provides the formal symbols for a universal computer network. It enables large symbols to be compiled from smaller symbols, that is, large actions from sequences of small actions. I, for instance, am an event (from birth to death) comprising some 10⁶⁴ processing steps, each step involving one quantum of action. This number is very great because my mass is equivalent to a large amount of energy, and the quantum of action is exceedingly small, about 7 × 10⁻³⁴ Joule seconds.

Here we come to a problem. The mathematics used to describe wave functions is continuous. The world itself, on the other hand, is discrete. There is an atomic action, measured by Planck's constant. Fundamental particles are discrete, as are atoms, molecules, trees, people, stars, leaves and everything else we can see. The illusion of continuity arises because the actions are so small that special instruments are necessary to observe them. This difficulty was solved mathematically by John von Neumann, who proposed that the proper mathematical foundation for quantum mechanics is an abstract complex Hilbert space. John von Neumann (2014): Mathematical Foundations of Quantum Mechanics

The period from Planck's discovery of the quantum of action in 1900 until 1925, when Heisenberg, Born and Jordan formulated matrix mechanics, is known as the era of the old quantum theory, dominated by Bohr's explanation of the spectrum of hydrogen in terms of orbiting electrons. Bohr assumed that the hydrogen electrons moved in orbits whose angular momentum was an integral number of quanta, and that the frequencies of the photons emitted and absorbed by hydrogen depended on the actual orbital frequencies of the electrons. Matrix mechanics showed that the energy of the photons did not depend directly on the energies of the imagined orbits of the electrons, but on the differences between the energies of different electronic states. The discovery of the uncertainty principle showed that it did not make sense to attribute definite orbits to electrons or definite space-time trajectories to very small particles. Matrix mechanics - Wikipedia

Quantum mechanics began as a child of classical mechanics, but ran into so many difficulties that Heisenberg determined to make a clean start, concentrating only on observable quantities, the frequencies and intensities of atomic radiation. Born realized that the mathematical form behind Heisenberg's work was that of matrices, and it soon became clear that matrices were the key to quantum mechanics. Soon after Heisenberg's work, Erwin Schrödinger, inspired by de Broglie's idea that all fundamental particles are associated with waves, found a linear partial differential equation that modelled the time evolution of a quantum state. Schrödinger showed that his approach was equivalent to matrix mechanics. A little later, Paul Dirac was able to combine both approaches in his transformation theory, and John von Neumann showed that the mathematics of linear operators (represented by matrices) provided a consistent mathematical formalism for quantum mechanics. The measurement problem remained unsolved, however. Paul Dirac (1983): The Principles of Quantum Mechanics (4th ed)

Back to top

5.11: Complementarity: measurement and meaning

A measurement is a count of some sort. We measure the number of people in a country by a census, counting them. We measure the length of a trip by counting the kilometres. We measure the mass of a baby by counting the grams. The meaning of the measurement depends on how we get the count. The census tells us how many people there are in a country and where they live because census takers go round to every dwelling to report how many people sleep there on census night.

The simplest measurements are the source of physics. A physicist, measuring a Bible, might tell us that it is 50 millimetres thick and has a mass of two kilograms. Never mind that it is considered one of the most meaningful books ever published; that is irrelevant to a physicist (and the post office). In general, physical observations are a matter of binning and counting. The gigantic measuring instruments in the Large Hadron Collider are elaborate structures of electric and magnetic fields and sensors designed to separate the particles arising from an event into separate bins of charge, mass, momentum and so on, and to count the number of particles in each bin. The physicists then use their mathematical models of the physical world to give meaning to all these counts. CERN: ATLAS Detector
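
In computational terms, binning and counting is histogramming. A toy sketch (Python; the 'detected' events are invented for illustration):

from collections import Counter

# Measurement as binning and counting: sort each detected event into a bin
# and tally the bins. The 'events' here are made-up charges.

events = [+1, -1, 0, +1, +1, -1, 0, +1, -1, +1]

counts = Counter(events)       # bin -> number of particles in that bin
print(counts)                  # Counter({1: 5, -1: 3, 0: 2})

# The physics lies in the model used to give these counts meaning.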

Our eyes work in a similar way. The lens at the front of each eye projects an image onto the retina, which comprises about 100 million light sensitive receptors, each of which samples a pixel of the retinal image. The receptors are the bins of the measuring process, and each one counts how frequently photons of light fall upon it. Some, the 'cones', are sensitive to colour and separate colours into three bins. Others, the 'rods', detect light of all colours. Like the census, the position of the receptor is also significant. Beginning in the retina itself, this raw information goes through a number of layers of processing to yield the visual awareness that enters our consciousness. Things are said to be complementary if one adds meaning to another. So all our visual receptors are complementary, as are all the neurons in the image processing circuits of our visual system. Visual system - Wikipedia

Quantum mechanics tells us that each measurement is accompanied by at least one quantum of action. Action serves as the 'universal' bin: every observation is an action. The nature of the actions is something else, and here we expect to find a countable infinity of bins each corresponding to a computable algorithm used by the Universe to reveal the observed particle and its properties.

The meaning of each action depends upon where it lies in the space of actions with which it is communicating. By observing large numbers of atomic events within ourselves, we gather the information necessary to form the mental images we use to guide our activities in the world.

Act is the unmeasured measure. Since the time of Aristotle, given the hypothesis that the universe is divine, an act is divine, God. Although we imagine God as the whole of reality and the quantum of action is exceedingly small, because there is no relevant metric they can both be identified simply as pure action. Every quantum of action is independent, but they are connected as elements of a logical continuum to create the universe. The ancient mystery of the Trinity is the first theological expression of this multiplicity within singularity. Aquinas, Summa, I, 27, 1: Is there procession in God?

Back to top

5.12: Visibility and invisibility

The traditional God is a mysterious other, completely invisible to us. Long ago a few people were inspired to write the Old Testament of the Bible, upon which the Catholic Church is based. Later, God themself is believed to have appeared among us in the person of Jesus of Nazareth. The Church holds that the story of the divine Jesus recorded in the New Testament completed God's revelation of themself to us. About a thousand pages of print in all.

The hypothesis that the Universe is divine reveals infinitely more information about God, but much remains invisible and uncertain. We can see many of the fixed points in God, but the dynamics remain hidden and a matter for speculation, as we see in quantum mechanics. This is perhaps most obvious in the field of human relations, where it is often very hard to be certain what someone is thinking. These problems are probably most intense in the areas of love and politics. Do you love me? How can I make you love me? Are you really on my side? Why did you stab me in the back? And so on.

A similar level of uncertainty exists at all other scales, greater and smaller than the human individual. One of the most surprising discoveries of twentieth century physics is the uncertainty principle, which holds at the quantum mechanical level in the physical network, our most fundamental theory of the Universe. This uncertainty arises because the observable universe is pixellated in units of the quantum of action. As in any pixellated image, we see no detail inside each pixel, only a plain colour.

Until the advent of quantum mechanics, physicists were generally inclined to believe that the world was deterministic. They still attribute determinism to the invisible process described by the wave equation that underlies quantum observations, but they now have to accept that even though the hidden quantum processes may be deterministic, they do not determine when things happen. The nature of the actual events is well defined, so we can make quantum clocks that keep time to within a second in 100 million years. The long run frequency of quantum events can be accurately predicted by the Born rule, but individual quantum events occur at random. This God plays dice: the numbers come up exactly, but we cannot tell which one it will be at a given time. This led Einstein to feel that quantum mechanics is an incomplete and provisional theory. Atomic clock - Wikipedia

The precision in quantum events arises because they are measured by Planck's constant, a quantity which, like the speed of light, appears to be precisely fixed in nature. When an electron moves from one state to another in an atom, it emits or absorbs a photon with one quantum of angular momentum, and the electron involved changes its angular momentum by one unit in the opposite direction because action is conserved. Using quantum electrodynamics, we can sometimes compute the energy change associated with this transition to many decimal places, and there is no reason to suspect that it is not an exact fixed point in nature. Photon - Wikipedia, Quantum electrodynamics - Wikipedia

A second source of uncertainty is invisibility. We cannot see the underlying quantum mechanical process involved in the atomic emission and absorption of photons, for instance, so our knowledge of this is speculative rather than observational.

Why can't we see the mechanism that yields these results? Here we are proposing that the Universe is digital 'to the core'. We understand this by analogy with computer networks like the internet, where we find an explanation for the invisibility of processes and the visibility of their results. We assume that the observable fixed points in the Universe are the results of computations, and that the invisible dynamics of the Universe are executed by invisible computers. We suspect the presence of deterministic digital computers because of the precision with which nature determines the eigenvalues of various observations.

In everyday life we use networks through user interfaces, which might be on a computer or a phone. The interface enables us to transmit data to and receive data from the network. Behind the interfaces is the system which transmits data from one interface to another, the coding and switching network.

The work that goes on between the user interfaces is invisible or transparent to us. We do not become aware of it unless it breaks down and we need to understand how to fix it. From a physical point of view, the user interface of the world is the space-time in which we live. The messages we receive from the Universe are written in spacetime, and we act in spacetime to send messages back to the Universe.

The processing behind the scenes is invisible to us because a computer cannot both do something and describe to a bystander everything that it is doing. The reason for this is that for a computer there is no difference between computation and communication. The purpose of a communication is to transmit information from one point to another. A computer may not move through space, but it does move through time as it executes its software.

If a computer is to explain what it is doing, it must stop that task and turn to the communication task, another computation. This computation must also be communicated, requiring another computation, and so on without end, so that the initial task will never be completed.

We can see this feature of a communication network at work in the classic quantum mechanical two slit experiment. When we transmit particles through a barrier with two slits and do not check which slit each particle goes through, we get an interference pattern. If we make a measurement to check which slit the particle goes through, however, we lose the interference pattern. Our observation has the effect of stopping the interference process before it is complete, so that there is no interference. We cannot both have our process and observe it.

A third source of invisibility is symmetry. A snowflake is symmetrical, with six identical 'arms'. Because they are identical we cannot tell which is which. If we look away and someone turns the snowflake around, we have no way of telling how far it was turned, or whether it was turned at all.

Traditional theology holds that God has no structure:

When the existence of a thing has been ascertained there remains the further question of the manner of its existence, in order that we may know its essence. Now, because we cannot know what God is, but rather what they are not, we have no means for considering how God is, but rather how they are not. . . .

Now it can be shown how God is not, by denying them whatever is opposed to the idea of them, viz. composition, motion, and the like. Aquinas Summa I, 3, 1 Proemium

This is the famous via negativa.

Symmetries are situations where nothing observable happens. They are the practical boundaries of the dynamic Universe. We may picture this to a degree by imagining the string of a piano or guitar. When struck, the string vibrates at every point except at the two ends, which are held still by the structure of the instrument. Symmetry - Wikipedia

When we consider the Universe as divine, we can imagine the symmetries discovered by physics as the boundaries of the divinity. From a logical point of view, the dynamics of the Universe is consistent. The boundaries of the dynamics are the points beyond which it would become inconsistent, that is, non-existent.

All our experience is experience of God, and all our experiences are in effect measurements of God, that is, events that we see as fixed points in the divine dynamics. We can learn a lot more about the natural God than those who read only the Bible can learn about theirs. Most theology is written in the Book of Nature. Although the natural God is only partially visible, we are continually in contact with it, so that we have a good chance of learning how it works. Such knowledge is necessary both for survival and for its outstanding contribution to the spiritual quality of our lives. Spirituality - Wikipedia

Back to top

5.13: Network intelligence: collective mind

Intelligence has two common meanings. The first is simply the collection of data. So intelligence services use spies, wiretaps, interrogation and other methods to collect information about their targets, who are usually cast as potential troublemakers either inside or outside the body politic, particularly those who try to make government accountable. Australia–East Timor spying scandal - Wikipedia

The second is the interpretation of data, the more common meaning of intelligence. In the spying trade, this may mean decoding intercepted coded messages to get a plain text, then decoding the plain text to understand what it means. In normal life all transmissions of information need encoding and decoding, as we have discussed in chapter 2, on language. The only exception to this may be gravitation - see Chapter 6. When we are using our native languages, this process seems to be instantaneous, but we are often faced with more complex data that take a long time to decode.

Networks speed up both these forms of intelligence. Data collection is improved if there are many collectors collaborating on the same task and sharing their results. Our eyes, for instance, use a network of about 100 million receptors in the retina of each eye. Each of these receptors samples a pixel of the retinal image, and the data is then collected, compressed and transmitted to our brains.

Data processing or decoding is speeded up by the power of a network to process data in parallel. Many processors working on the same problem are quite likely to speed up the discovery of a solution. We see this phenomenon in the science industry. More scientists collect more data and apply more minds to its interpretation, speeding up the process of understanding the world and developing technologies to exploit our understanding. The large number of laboratories around the world working to develop a vaccine for the covid-19 virus is an example of this approach: there is a global network of vaccine developers.

In the political realm, network communications may be a mixed blessing. On the one hand, the fact that almost anybody can spill the beans on anybody else can serve as a powerful weapon against corruption. Networks can also work in the opposite direction, speeding up the generation and propagation of fiction. We have learnt that a large proportion of network users place more value on sensation and conspiracy than truth. The general standard of comment seems to be pretty low, and people who might be inclined to be polite in face to face encounters hide behind anonymity to reveal their true characters - a valuable revelation.

The Catholic Church believes it has been mandated by its God to spread its story to everybody, and it goes about this task aggressively through its social roles in welfare, education and politics. This work has been very effective, so that the doctrinal errors embedded in the Church are very well established and difficult to disrupt. In the long run, however, the discrepancies between Catholic propaganda and reality will become more obvious as the alternative becomes more acceptable. Christian theology will eventually escape from the political control which locked it into a fictional world 1600 years ago when it sold its soul to the Roman Empire and became a warmongering political entity like its imperial sponsors. Constantine the Great and Christianity - Wikipedia

(revised 4 July 2021)

Back to top

Back to table of contents

Copyright:

You may copy this material freely provided only that you quote fairly and provide a link (or reference) to your source.

Further reading

Books

Ashby, W Ross, An Introduction to Cybernetics, Methuen 1956, 1964 'This book is intended to provide [an introduction to cybernetics]. It starts from common-place and well understood concepts, and proceeds step by step to show how these concepts can be made exact, and how they can be developed until they lead into such subjects as feedback, stability, regulation, ultrastability, information, coding, noise and other cybernetic topics.' 

Cantor (1897, 1955), Georg, Contributions to the Founding of the Theory of Transfinite Numbers (Translated, with Introduction and Notes by Philip E B Jourdain), Dover 1895, 1897, 1955 Jacket: 'One of the greatest mathematical classics of all time, this work established a new field of mathematics which was to be of incalculable importance in topology, number theory, analysis, theory of functions, etc, as well as the entire field of modern logic.' 

Cohen (1980), Paul J, Set Theory and the Continuum Hypothesis, Benjamin/Cummings 1966-1980 Preface: 'The notes that follow are based on a course given at Harvard University, Spring 1965. The main objective was to give the proof of the independence of the continuum hypothesis [from the Zermelo-Fraenkel axioms for set theory with the axiom of choice included]. To keep the course as self contained as possible we included background materials in logic and axiomatic set theory as well as an account of Gödel's proof of the consistency of the continuum hypothesis. . . .'  

Dirac (1983), P A M, The Principles of Quantum Mechanics (4th ed), Oxford UP/Clarendon 1983 Jacket: '[this] is the standard work in the fundamental principles of quantum mechanics, indispensible both to the advanced student and the mature research worker, who will always find it a fresh source of knowledge and stimulation.' (Nature)  

Feynman, Richard, Feynman Lectures on Computation, Perseus Publishing 2007 Amazon Editorial Reviews Book Description 'The famous physicist's timeless lectures on the promise and limitations of computers When, in 1984-86, Richard P. Feynman gave his famous course on computation at the California Institute of Technology, he asked Tony Hey to adapt his lecture notes into a book. Although led by Feynman, the course also featured, as occasional guest speakers, some of the most brilliant men in science at that time, including Marvin Minsky, Charles Bennett, and John Hopfield. Although the lectures are now thirteen years old, most of the material is timeless and presents a "Feynmanesque" overview of many standard and some not-so-standard topics in computer science such as reversible logic gates and quantum computers.'  

Feynman (1988), Richard, QED: The Strange Story of Light and Matter, Princeton UP 1988 Jacket: 'Quantum electrodynamics - or QED for short - is the 'strange theory' that explains how light and electrons interact. Thanks to Richard Feynmann and his colleagues, it is also one of the rare parts of physics that is known for sure, a theory that has stood the test of time. . . . In this beautifully lucid set of lectures he provides a definitive introduction to QED.' 

Hallett (1984), Michael, Cantorian Set Theory and Limitation of Size, Oxford UP 1984 Jacket: 'This book will be of use to a wide audience, from beginning students of set theory (who can gain from it a sense of how the subject reached its present form), to mathematical set theorists (who will find an expert guide to the early literature), and for anyone concerned with the philosophy of mathematics (who will be interested by the extensive and perceptive discussion of the set concept).' Daniel Isaacson. 

Hawking (1975), Steven W, and G F R Ellis, The Large Scale Structure of Space-Time, Cambridge UP 1975 Preface: Einstein's General Theory of Relativity . . . leads to two remarkable predictions about the universe: first that the final fate of massive stars is to collapse behind an event horizon to form a 'black hole' which will contain a singularity; and secondly that there is a singularity in our past which constitutes, in some sense, a beginning to our universe. Our discussion is principally aimed at developing these two results.' 

Hodges (1983), Andrew, Alan Turing: The Enigma, Burnett 1983 Author's note: '. . . modern papers often employ the usage turing machine. Sinking without a capital letter into the collective mathematical consciousness (as with the abelian group, or the riemannian manifold) is probably the best that science can offer in the way of canonisation.' (530) 

Jech (1997), Thomas, Set Theory, Springer 1997 Jacket: 'This book covers major areas of modern set theory: cardinal arithmetic, constructible sets, forcing and Boolean-valued models, large cardinals and descriptive set theory. . . . It can be used as a textbook for a graduate course in set theory and can serve as a reference book.' 

Klein, Richard G, The Human Career: Human Biological and Cultural Origins, University of Chicago Press 2009 ' Since its publication in 1989, The Human Career has proved to be an indispensable tool in teaching human origins. This substantially revised third edition retains Richard G. Klein’s innovative approach while showing how cumulative discoveries and analyses over the past ten years have significantly refined our knowledge of human evolution. . . . In addition to outlining the broad pattern of human evolution, The Human Career details the kinds of data that support it. For the third edition, Klein has added numerous tables and a fresh citation system designed to enhance readability, especially for students. He has also included more than fifty new illustrations to help lay readers grasp the fossils, artifacts, and other discoveries on which specialists rely. With abundant references and hundreds of images, charts, and diagrams, this new edition is unparalleled in its usefulness for teaching human evolution.' 

Nielsen, Michael A, and Isaac L Chuang, Quantum Computation and Quantum Information, Cambridge University Press 2000 Review: A rigorous, comprehensive text on quantum information is timely. The study of quantum information and computation represents a particularly direct route to understanding quantum mechanics. Unlike the traditional route to quantum mechanics via Schroedinger's equation and the hydrogen atom, the study of quantum information requires no calculus, merely a knowledge of complex numbers and matrix multiplication. In addition, quantum information processing gives direct access to the traditionally advanced topics of measurement of quantum systems and decoherence.' Seth Lloyd, Department of Quantum Mechanical Engineering, MIT, Nature 6876: vol 416 page 19, 7 March 2002. 

Russell, Bertrand, The Principles of Mathematics, W W Norton & Co 1903, 1938, 1996 Amazon Product Description 'Russell's classic The Principles of Mathematics sets forth his landmark thesis that mathematics and logic are identical—that what is commonly called mathematics is simply later deductions from logical premises. His ideas have had a profound influence on twentieth-century work on logic and the foundations of mathematics.' 

Tanenbaum (1996), Andrew S, Computer Networks, Prentice Hall International 1996 Preface: 'The key to designing a computer network was first enunciated by Julius Caesar: Divide and Conquer. The idea is to design a network as a sequence of layers, or abstract machines, each one based upon the previous one. . . . This book uses a model in which networks are divided into seven layers. The structure of the book follows the structure of the model to a considerable extent.'  

von Neumann, John, and Robert T Beyer (translator), Mathematical Foundations of Quantum Mechanics, Princeton University Press 1983 Jacket: '. . . a revolutionary book that caused a sea change in theoretical physics. . . . JvN begins by presenting the theory of Hermitean operators and Hilbert spaces. These provide the framework for transformation theory, which JvN regards as the definitive form of quantum mechanics. . . . Regarded as a tour de force at the time of its publication, this book is still indispensable for those interested in the fundamental issues of quantum mechanics.' 

Wilczek, Frank, The Lightness of Being: Mass, Ether, and the Unification of Forces, Basic Books 2008 ' In this excursion to the outer limits of particle physics, Wilczek explores what quarks and gluons, which compose protons and neutrons, reveal about the manifestation of mass and gravity. A corecipient of the 2004 Nobel Prize in Physics, Wilczek knows what he’s writing about; the question is, will general science readers? Happily, they know what the strong interaction is (the forces that bind the nucleus), and in Wilczek, they have a jovial guide who adheres to trade publishing’s belief that a successful physics title will not include too many equations. Despite this injunction (against which he lightly protests), Wilczek delivers an approachable verbal picture of what quarks and gluons are doing inside a proton that gives rise to mass and, hence, gravity. Casting the light-speed lives of quarks against “the Grid,” Wilczek’s term for the vacuum that theoretically seethes with quantum activity, Wilczek exudes a contagious excitement for discovery. A near-obligatory acquisition for circulating physics collections.' --Gilbert Taylor  

Zemanian, Armen H, Transfiniteness for Graphs, Electrical Networks and Random Walks, Springer Verlag 1996 'A substantial introduction is followed by chapters covering transfinite graphs; connectedness problems; finitely structured transfinite graphs; transfinite electrical networks; permissively finitely structured networks; and a theory for random walks on a finitely structured transfinite network. Appendices present brief surveys of ordinal and cardinal numbers; summable series; and irreducible and reversible Markov chains. Accessible to those familiar with basic ideas about graphs, Hilbert spaces, and resistive electrical networks. (Annotation copyright Book News, Inc. Portland, Or.)'

Papers

Salart (2008), Daniel, et al, "Testing the speed of 'spooky action at a distance'", Nature, 454, , 14 August 2008, page 861-864. 'Correlations are generally described by one of two mechanisms: either a first event influences a second one by sending information encoded in bosons or other physical carriers, or the correlated events have some common causes in their shared history. Quantum physics predicts an entirely different kind of cause for some correlations, named entanglement. This reveals itself in correlations that violate Bell inequalities (implying that they cannot be described by common causes) between space-like separated events (implying that they cannot be described by classical communication). Many Bell tests have been performed, and loopholes related to locality and detection have been closed in several independent experiments. It is still possible that a first event could influence a second, but the speed of this hypothetical influence (Einstein's 'spooky action at a distance') would need to be defined in some universal privileged reference frame and be greater than the speed of light. Here we put stringent experimental bounds on the speed of all such hypothetical influences. We performed a Bell test over more than 24 hours between two villages separated by 18 km and approximately east–west oriented, with the source located precisely in the middle. We continuously observed two-photon interferences well above the Bell inequality threshold. Taking advantage of the Earth's rotation, the configuration of our experiment allowed us to determine, for any hypothetically privileged frame, a lower bound for the speed of the influence. For example, if such a privileged reference frame exists and is such that the Earth's speed in this frame is less than 10-3 times that of the speed of light, then the speed of the influence would have to exceed that of light by at least four orders of magnitude.. back

Links

Alan Turing, On Computable Numbers, with an application to the Entscheidungsproblem, 'The "computable" numbers may be described briefly as the real numbers whose expressions as a decimal are calculable by some finite means. Although the subject of this paper is ostensibly the computable numbers, it is almost equally easy to define and investigate computable functions of an integral variable or a real or computable variable, computable predicates and so forth. . . . '

Algorithmic information theory - Wikipedia, the free encyclopedia, 'Algorithmic information theory is a subfield of information theory and computer science that concerns itself with the relationship between computation and information. According to Gregory Chaitin, it is "the result of putting Shannon's information theory and Turing's computability theory into a cocktail shaker and shaking vigorously".'

Angels in Judaism - Wikipedia, the free encyclopedia, 'In Judaism an angel (Hebrew: מַלְאָךְ‎‎ malakh, plural malakhim) is a messenger of God, an angelic envoy or an angel in general who appears throughout the Hebrew Bible, Rabbinic literature, and traditional Jewish liturgy. Angels in Judaism are categorized in different hierarchies.'

Aquinas Summa I, 3, 1 Proemium, Of the Simplicity of God, 'When the existence of a thing has been ascertained, there remains the further question of the manner of its existence, in order that we may know its essence. Now, because we cannot know what God is, but rather what He is not, we have no means for considering how God is, but rather how He is not. . . . Now it can be shown how God is not, by denying Him whatever is opposed to the idea of Him, viz. composition, motion, and the like. Therefore, we must first inquire into His simplicity, whereby we deny composition in Him.'

Aquinas, Summa, I, 27, 1, Is there procession in God?, 'As God is above all things, we should understand what is said of God, not according to the mode of the lowest creatures, namely bodies, but from the similitude of the highest creatures, the intellectual substances; while even the similitudes derived from these fall short in the representation of divine objects. Procession, therefore, is not to be understood from what it is in bodies, either according to local movement or by way of a cause proceeding forth to its exterior effect, as, for instance, like heat from the agent to the thing made hot. Rather it is to be understood by way of an intelligible emanation, for example, of the intelligible word which proceeds from the speaker, yet remains in him. In that sense the Catholic Faith understands procession as existing in God.'

Aquinas, Summa: I, 7, 1, Is God infinite?, 'Since therefore the divine being is not a being received in anything, but He is His own subsistent being . . . it is clear that God Himself is infinite and perfect.'

Aristotle, Prior Analytics, 'We must first state the subject of our inquiry and the faculty to which it belongs: its subject is demonstration and the faculty that carries it out demonstrative science. We must next define a premiss, a term, and a syllogism, and the nature of a perfect and of an imperfect syllogism; and after that, the inclusion or noninclusion of one term in another as in a whole, and what we mean by predicating one term of all, or none, of another.'

Aristotle (continuity), Physics V, iii, 'A thing that is in succession and touches is 'contiguous'. The 'continuous' is a subdivision of the contiguous: things are called continuous when the touching limits of each become one and the same and are, as the word implies, contained in each other: continuity is impossible if these extremities are two. This definition makes it plain that continuity belongs to things that naturally in virtue of their mutual contact form a unity. And in whatever way that which holds them together is one, so too will the whole be one, e.g. by a rivet or glue or contact or organic union.' 227a10 sqq

Atomic clock - Wikipedia, the free encyclopedia, 'An atomic clock is a clock device that uses an electronic transition frequency in the microwave, optical, or ultraviolet region of the electromagnetic spectrum of atoms as a frequency standard for its timekeeping element.'

Australia–East Timor spying scandal - Wikipedia, the free encyclopedia, 'The Australia–East Timor spying scandal began in 2004 when the Australian Secret Intelligence Service (ASIS) clandestinely planted covert listening devices in a room adjacent to the East Timor (Timor-Leste) Prime Minister's Office at Dili, to obtain information in order to ensure Australia held the upper hand in negotiations with East Timor over the rich oil and gas fields in the Timor Gap. . . . When the espionage became known, East Timor rejected the Timor Sea treaty, and referred the matter to the International Court of Justice (ICJ) in The Hague. Timor's lawyers, including Bernard Collaery, intended to call Witness K as a confidential witness in an 'in camera' hearing in March 2014. However, in December 2013 the homes and office of both Witness K and his lawyer Bernard Collaery were raided and searched by ASIO and Australian Federal Police, and many legal documents were confiscated. East Timor immediately sought an order from the ICJ for the sealing and return of the documents. In March 2014, the ICJ ordered Australia to stop spying on East Timor. The Permanent Court of Arbitration in The Hague considered claims by East Timor over the territory until early 2017, when East Timor dropped the ICJ case against Australia after the Australian Government agreed to renegotiate. In 2018, the parties signed a new agreement which gave 80% of the profits to East Timor and 20% to Australia.'

Australian Intelligence Community - Wikipedia, the free encyclopedia, 'The National Security Committee (NSC) of Cabinet is a Cabinet committee and the peak ministerial decision-making body on national security, intelligence and defence matters. It is chaired by the Prime Minister and the membership includes the Deputy Prime Minister, Attorney-General, Treasurer, Minister for Foreign Affairs, Minister for Defence, and the ministerial Cabinet Secretary.'

Bandwidth (computing) - Wikipedia, the free encyclopedia, 'In computer networking and computer science, bandwidth, network bandwidth, data bandwidth, or digital bandwidth is a measure of available or consumed data communication resources expressed in bits/second or multiples of it (kilobits/s, megabits/s etc.).'

Beth Blaxland & Fran Dorey, The first migrations out of Africa, ' The oldest known Homo sapiens fossils outside of Africa come from caves in Israel - Misliya (about 180,000 years old), Skhul (about 90,000 years old) and Qafzeh (about 120,000 years old). These probably represent populations that intermittently occupied the region and it is unlikely that there was direct evolutionary continuity between the Misliya and later Skhul/Qafzeh peoples. Genetic studies also support the idea of earlier dispersals of modern humans out of Africa starting from about 220,000 years ago.' back

Bombe - Wikipedia, the free encyclopedia, 'The bombe was an electromechanical device used by British cryptologists to help decipher German Enigma-machine-encrypted secret messages during World War II. The US Navy and US Army later produced their own machines to the same functional specification, but engineered differently from each other and from the British Bombe.'

Born rule - Wikipedia, the free encyclopedia, 'The Born rule (also called the Born law, Born's rule, or Born's law) is a law of quantum mechanics which gives the probability that a measurement on a quantum system will yield a given result. It is named after its originator, the physicist Max Born. The Born rule is one of the key principles of the Copenhagen interpretation of quantum mechanics. There have been many attempts to derive the Born rule from the other assumptions of quantum mechanics, with inconclusive results. . . . The Born rule states that if an observable corresponding to a Hermitian operator A with discrete spectrum is measured in a system with normalized wave function ψ (see bra-ket notation), then the measured result will be one of the eigenvalues λ of A, and the probability of measuring a given eigenvalue λᵢ will equal ⟨ψ|Pᵢ|ψ⟩, where Pᵢ is the projection onto the eigenspace of A corresponding to λᵢ.'

Cantor's paradox - Wikipedia, the free encyclopedia, 'In set theory, Cantor's paradox is derivable from the theorem that there is no greatest cardinal number, so that the collection of "infinite sizes" is itself infinite. The difficulty is handled in axiomatic set theory by declaring that this collection is not a set but a proper class; in von Neumann–Bernays–Gödel set theory it follows from this and the axiom of limitation of size that this proper class must be in bijection with the class of all sets. Thus, not only are there infinitely many infinities, but this infinity is larger than any of the infinities it enumerates.'

Cantor's theorem - Wikipedia, the free encyclopedia, 'In elementary set theory, Cantor's theorem is a fundamental result which states that, for any set A, the set of all subsets of A (the power set of A, denoted by P(A)) has a strictly greater cardinality than A itself. For finite sets, Cantor's theorem can be seen to be true by simple enumeration of the number of subsets. Counting the empty set as a subset, a set with n members has a total of 2^n subsets, so that if card(A) = n, then card(P(A)) = 2^n, and the theorem holds because 2^n > n for all non-negative integers.'

Cardinal number - Wikipedia, the free encyclopedia, 'In mathematics, cardinal numbers, or cardinals for short, are a generalization of the natural numbers used to measure the cardinality (size) of sets. The cardinality of a finite set is a natural number: the number of elements in the set. The transfinite cardinal numbers describe the sizes of infinite sets.'

Cardinality of the continuum - Wikipedia, the free encyclopedia, 'In mathematics, the cardinality of the continuum (sometimes also called the power of the continuum) is the cardinal number of the set of real numbers R (sometimes called the continuum). This cardinal number is often denoted by c, so c = |R|.'

Carl Zimmer, A Single Migration From Africa Populated the World, Studies Find, 'Modern humans evolved in Africa roughly 200,000 years ago. But how did our species go on to populate the rest of the globe? . . . In a series of extraordinary genetic analyses published on Wednesday, researchers believe they have found an answer. In the journal Nature, three separate teams of geneticists survey DNA collected from cultures around the globe, many for the first time, and conclude that all non-Africans today trace their ancestry to a single population emerging from Africa between 50,000 and 80,000 years ago.'

CERN, ATLAS Detector, 'Beams of particles from the LHC collide at the centre of the ATLAS detector making collision debris in the form of new particles, which fly out from the collision point in all directions. Six different detecting subsystems arranged in layers around the collision point record the paths, momentum, and energy of the particles, allowing them to be individually identified. A huge magnet system bends the paths of charged particles so that their momenta can be measured.'

Claude E Shannon, A Mathematical Theory of Communication, 'The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages.'

Claude Shannon, Communication in the Presence of Noise, 'A method is developed for representing any communication system geometrically. Messages and the corresponding signals are points in two “function spaces,” and the modulation process is a mapping of one space into the other. Using this representation, a number of results in communication theory are deduced concerning expansion and compression of bandwidth and the threshold effect. Formulas are found for the maximum rate of transmission of binary digits over a system when the signal is perturbed by various types of noise. Some of the properties of “ideal” systems which transmit at this maximum rate are discussed. The equivalent number of binary digits per second for certain information sources is calculated.' [C. E. Shannon, “Communication in the presence of noise,” Proc. IRE, vol. 37, pp. 10–21, Jan. 1949.]

Claude Shannon - Wikipedia, the free encyclopedia, 'Claude Elwood Shannon (April 30, 1916 – February 24, 2001), an American electrical engineer and mathematician, has been called "the father of information theory". Shannon is famous for having founded information theory and both digital computer and digital circuit design theory when he was 21 years old by way of a master's thesis published in 1937, wherein he articulated that electrical application of Boolean algebra could construct and resolve any logical, numerical relationship. It has been claimed that this was the most important master's thesis of all time.'

Coding theory - Wikipedia, the free encyclopedia, 'Coding theory is the study of the properties of codes and their fitness for a specific application. Codes are used for data compression, cryptography, error-correction and more recently also for network coding. Codes are studied by various scientific disciplines—such as information theory, electrical engineering, mathematics, and computer science—for the purpose of designing efficient and reliable data transmission methods.'

Colossus computer - Wikipedia, the free encyclopedia, 'Colossus was the name of a series of computers developed by British codebreakers in 1943-1945 to help in the cryptanalysis of the Lorenz cipher. Colossus used thermionic valves (vacuum tubes) and thyratrons to perform Boolean and counting operations. Colossus is thus regarded as the world's first programmable, electronic, digital computer, although it was programmed by plugs and switches and not by a stored program.'

Computer - Wikipedia, the free encyclopedia, 'A computer is a machine that can be instructed to carry out sequences of arithmetic or logical operations automatically via computer programming. Modern computers have the ability to follow generalized sets of operations, called programs. These programs enable computers to perform an extremely wide range of tasks. A "complete" computer including the hardware, the operating system (main software), and peripheral equipment required and used for "full" operation can be referred to as a computer system.'

Computer network - Wikipedia, the free encyclopedia, 'A computer network, or simply a network, is a collection of computers and network hardware interconnected by communication channels that allow sharing of resources and information. . . . The best known computer network is the Internet. . . . Computer networking can be considered a branch of electrical engineering, telecommunications, computer science, information technology or computer engineering, since it relies upon the theoretical and practical application of the related disciplines.'

Constantine the Great and Christianity - Wikipedia, the free encyclopedia, 'During the reign of the Roman Emperor Constantine the Great (AD 306–337), Christianity began to transition to the dominant religion of the Roman Empire. Historians remain uncertain about Constantine's reasons for favoring Christianity, and theologians and historians have often argued about which form of early Christianity he subscribed to. . . . Constantine's decision to cease the persecution of Christians in the Roman Empire was a turning point for early Christianity, sometimes referred to as the Triumph of the Church, the Peace of the Church or the Constantinian shift. In 313, Constantine and Licinius issued the Edict of Milan decriminalizing Christian worship. The emperor became a great patron of the Church and set a precedent for the position of the Christian emperor within the Church and raised the notions of orthodoxy, Christendom, ecumenical councils, and the state church of the Roman Empire declared by edict in 380. He is revered as a saint and is apostolos in the Eastern Orthodox Church, Oriental Orthodox Church, and various Eastern Catholic Churches for his example as a "Christian monarch".'

Cursus publicus - Wikipedia, the free encyclopedia, 'The cursus publicus (Latin: "the public way") was the state-run courier and transportation service of the Roman Empire, later inherited by the Byzantine Empire. The Emperor Augustus created it to transport messages, officials, and tax revenues between the provinces and Italy.'

Cyclic group - Wikipedia, the free encyclopedia, 'In group theory, a branch of abstract algebra, a cyclic group or monogenous group is a group that is generated by a single element. That is, it is a set of invertible elements with a single associative binary operation, and it contains an element g such that every other element of the group may be obtained by repeatedly applying the group operation to g or its inverse. Each element can be written as a power of g in multiplicative notation, or as a multiple of g in additive notation. This element g is called a generator of the group.'

David Hilbert - Wikipedia, the free encyclopedia, 'David Hilbert (January 23, 1862 – February 14, 1943) was a German mathematician, recognized as one of the most influential and universal mathematicians of the 19th and early 20th centuries. He invented or developed a broad range of fundamental ideas, in invariant theory, the axiomatization of geometry, and with the notion of Hilbert space, one of the foundations of functional analysis. Hilbert adopted and warmly defended Georg Cantor's set theory and transfinite numbers. A famous example of his leadership in mathematics is his 1900 presentation of a collection of problems that set the course for much of the mathematical research of the 20th century. Hilbert and his students supplied significant portions of the mathematical infrastructure required for quantum mechanics and general relativity. He is also known as one of the founders of proof theory, mathematical logic and the distinction between mathematics and metamathematics.'

Differentiable manifold - Wikipedia, the free encyclopedia, 'In mathematics, a differentiable manifold is a type of manifold that is locally similar enough to a linear space to allow one to do calculus. Any manifold can be described by a collection of charts, also known as an atlas. One may then apply ideas from calculus while working within the individual charts, since each chart lies within a linear space to which the usual rules of calculus apply. If the charts are suitably compatible (namely, the transition from one chart to another is differentiable), then computations done in one chart are valid in any other differentiable chart.'

Double-slit experiment - Wikipedia, the free encyclopedia, 'In the double-slit experiment, light is shone at a solid thin plate that has two slits cut into it. A photographic plate is set up to record what comes through those slits. One or the other slit may be open, or both may be open. . . . The most baffling part of this experiment comes when only one photon at a time is fired at the barrier with both slits open. The pattern of interference remains the same as can be seen if many photons are emitted one at a time and recorded on the same sheet of photographic film. The clear implication is that something with a wavelike nature passes simultaneously through both slits and interferes with itself — even though there is only one photon present. (The experiment works with electrons, atoms, and even some molecules too.)'

Einstein, Podolsky and Rosen, Can the Quantum Mechanical Description of Physical Reality be Considered Complete?, A PDF of the classic paper. 'In a complete theory there is an element corresponding to each element of reality. A sufficient condition for the reality of a physical quantity is the possibility of predicting it with certainty, without disturbing the system. In quantum mechanics in the case of two physical quantities described by non-commuting operators, the knowledge of one precludes the knowledge of the other. Then either (1) the description of reality given by the wave function in quantum mechanics is not complete or (2) these two quantities cannot have simultaneous reality. Consideration of the problem of making predictions concerning a system on the basis of measurements made on another system that had previously interacted with it leads to the result that if (1) is false then (2) is also false. One is thus led to conclude that the description of reality given by the wave function is not complete.'

Electrical telegraph - Wikipedia, the free encyclopedia, 'An electrical telegraph is a telegraph that uses electrical signals, usually conveyed via dedicated telecommunication lines or radio. . . . In a matter of decades after their creation, electrical telegraph networks permitted people and commerce to transmit messages across both continents and oceans almost instantly, with widespread social and economic impacts.'

ENIAC - Wikipedia, the free encyclopedia, 'ENIAC (Electronic Numerical Integrator And Computer) was the first electronic general-purpose computer. It was Turing-complete, digital, and could solve "a large class of numerical problems" through reprogramming. Although ENIAC was designed and primarily used to calculate artillery firing tables for the United States Army's Ballistic Research Laboratory, its first programs included a study of the feasibility of the thermonuclear weapon.'

Feynman Lectures on Physics, Vol. I, chapter 22, Algebra, 'In our study of oscillating systems we shall have occasion to use one of the most remarkable, almost astounding, formulas in all of mathematics. From the physicist’s point of view we could bring forth this formula in two minutes or so, and be done with it. But science is as much for intellectual enjoyment as for practical utility, so instead of just spending a few minutes on this amazing jewel, we shall surround the jewel by its proper setting in the grand design of that branch of mathematics which is called elementary algebra.'

Feynman, Leighton & Sands, FLP III:1 Quantum behaviour, '[Heisenberg] proposed, as a general principle, his uncertainty principle, which we can state in terms of our experiment as follows: “It is impossible to design an apparatus to determine which hole the electron passes through, that will not at the same time disturb the electrons enough to destroy the interference pattern.” If an apparatus is capable of determining which hole the electron goes through, it cannot be so delicate that it does not disturb the pattern in an essential way.' back

Feynman, Leighton & Sands FLP III:01, Chapter 1: Quantum Behaviour, 'The gradual accumulation of information about atomic and small-scale behavior during the first quarter of the 20th century, which gave some indications about how small things do behave, produced an increasing confusion which was finally resolved in 1926 and 1927 by Schrödinger, Heisenberg, and Born. They finally obtained a consistent description of the behavior of matter on a small scale. We take up the main features of that description in this chapter.' back

Feynman, Leighton & Sands, I:22, Feynman Lectures on Physics: I:22 Algebra, 'In our study of oscillating systems we shall have occasion to use one of the most remarkable, almost astounding, formulas in all of mathematics. From the physicist’s point of view we could bring forth this formula in two minutes or so, and be done with it. But science is as much for intellectual enjoyment as for practical utility, so instead of just spending a few minutes on this amazing jewel, we shall surround the jewel by its proper setting in the grand design of that branch of mathematics which is called elementary algebra.' back

Function (mathematics) - Wikipedia, the free encyclopedia, 'The mathematical concept of a function expresses dependence between two quantities, one of which is given (the independent variable, argument of the function, or its "input") and the other produced (the dependent variable, value of the function, or "output"). A function associates a single output with every input element drawn from a fixed set, such as the real numbers.'

Function space - Wikipedia, the free encyclopedia, 'In mathematics, a function space is a set of functions of a given kind from a set X to a set Y. It is called a space because in many applications, it is a topological space or a vector space or both.'

Georg Cantor - Wikipedia, the free encyclopedia, 'Georg Ferdinand Ludwig Philipp Cantor (March 3 [O.S. February 19] 1845 – January 6, 1918) was a German mathematician, born in Russia. He is best known as the creator of set theory, which has become a fundamental theory in mathematics. Cantor established the importance of one-to-one correspondence between sets, defined infinite and well-ordered sets, and proved that the real numbers are "more numerous" than the natural numbers. In fact, Cantor's theorem implies the existence of an "infinity of infinities". He defined the cardinal and ordinal numbers and their arithmetic. Cantor's work is of great philosophical interest, a fact of which he was well aware.'

Gregory J. Chaitin, Gödel's Theorem and Information, International Journal of Theoretical Physics 21 (1982), pp. 941-954. 'Abstract: Gödel's theorem may be demonstrated using arguments having an information-theoretic flavor. In such an approach it is possible to argue that if a theorem contains more information than a given set of axioms, then it is impossible for the theorem to be derived from the axioms. In contrast with the traditional proof based on the paradox of the liar, this new viewpoint suggests that the incompleteness phenomenon discovered by Gödel is natural and widespread rather than pathological and unusual.'

Heisenberg picture - Wikipedia, the free encyclopedia, 'In physics, the Heisenberg picture (also called the Heisenberg representation) is a formulation (largely due to Werner Heisenberg in 1925) of quantum mechanics in which the operators (observables and others) incorporate a dependency on time, but the state vectors are time-independent, an arbitrary fixed basis rigidly underlying the theory.'

Hilbert space - Wikipedia, the free encyclopedia, 'The mathematical concept of a Hilbert space, named after David Hilbert, generalizes the notion of Euclidean space. It extends the methods of vector algebra and calculus from the two-dimensional Euclidean plane and three-dimensional space to spaces with any finite or infinite number of dimensions. A Hilbert space is a vector space equipped with an inner product, an operation that allows defining lengths and angles. Furthermore, Hilbert spaces are complete, which means that there are enough limits in the space to allow the techniques of calculus to be used.'

History of quantum mechanics - Wikipedia, the free encyclopedia, 'The history of quantum mechanics, as it interlaces with the history of quantum chemistry, began essentially with a number of different scientific discoveries: the 1838 discovery of cathode rays by Michael Faraday; the 1859-1860 winter statement of the black body radiation problem by Gustav Kirchhoff; the 1877 suggestion by Ludwig Boltzmann that the energy states of a physical system could be discrete; the discovery of the photoelectric effect by Heinrich Hertz in 1887; and the 1900 quantum hypothesis by Max Planck that any energy-radiating atomic system can theoretically be divided into a number of discrete "energy elements". . . '

History of the Internet - Wikipedia, the free encyclopedia, 'The history of the Internet begins with the development of electronic computers in the 1950s. Initial concepts of packet networking originated in several computer science laboratories in the United States, United Kingdom, and France. The US Department of Defense awarded contracts as early as the 1960s for packet network systems, including the development of the ARPANET (which would become the first network to use the Internet Protocol).'

Ineffability - Wikipedia, the free encyclopedia, 'Ineffability is concerned with ideas that cannot or should not be expressed in spoken words (or language in general), often being in the form of a taboo or incomprehensible term. This property is commonly associated with philosophy, aspects of existence, and similar concepts that are inherently "too great", complex, or abstract to be adequately communicated. In addition, illogical statements, principles, reasons, and arguments may be considered intrinsically ineffable along with impossibilities, contradictions, and paradoxes.'

Internet - Wikipedia, the free encyclopedia, 'The Internet is a global system of interconnected computer networks that use the standard Internet protocol suite (TCP/IP) to link several billion devices worldwide. It is an international network of networks that consists of millions of private, public, academic, business, and government packet switched networks, linked by a broad array of electronic, wireless, and optical networking technologies.'

Internet protocol suite - Wikipedia, the free encyclopedia, 'The Internet protocol suite is the conceptual model and set of communications protocols used in the Internet and similar computer networks. It is commonly known as TCP/IP because the foundational protocols in the suite are the Transmission Control Protocol (TCP) and the Internet Protocol (IP). During its development, versions of it were known as the Department of Defense (DoD) model because the development of the networking method was funded by the United States Department of Defense through DARPA. Its implementation is a protocol stack.'

Interpersonal relationship - Wikipedia, the free encyclopedia, 'An interpersonal relationship is a strong, deep, or close association or acquaintance between two or more people that may range in duration from brief to enduring. This association may be based on inference, love, solidarity, regular business interactions, or some other type of social commitment. Interpersonal relationships are formed in the context of social, cultural and other influences. The context can vary from family or kinship relations, friendship, marriage, relations with associates, work, clubs, neighborhoods, and places of worship. They may be regulated by law, custom, or mutual agreement, and are the basis of social groups and society as a whole.'

Invention of the telephone - Wikipedia, the free encyclopedia, 'The invention of the telephone was the culmination of work done by many individuals, and involved an array of lawsuits founded upon the patent claims of several individuals and numerous companies.'

Iris (mythology) - Wikipedia, the free encyclopedia, 'In Greek mythology, Iris is the personification of the rainbow and messenger of the gods. She is also known as one of the goddesses of the sea and the sky. Iris links the gods to humanity. She travels with the speed of wind from one end of the world to the other, and into the depths of the sea and the underworld.'

Isaac Newton, The Method of Fluxions and Infinite Series, London: printed by Henry Woodfall; sold by John Nourse, 1736. An unfinished posthumous work, first published in the Latin original in v. 1 of the Opera omnia (Londini, J. Nichols, 1779-85) under the title Artis analyticae specimina, vel Geometria analytica. Another translation, without Colson's commentary, appeared London, 1737 as A treatise on the method of fluxions and infinite series.

Jerome Gellman, Mysticism (Stanford Encyclopedia of Philosophy), 'The term ‘mysticism,’ comes from the Greek μυω, meaning “to conceal.” In the Hellenistic world, ‘mystical’ referred to “secret” religious rituals. In early Christianity the term came to refer to “hidden” allegorical interpretations of Scriptures and to hidden presences, such as that of Jesus at the Eucharist. . . . Typically, mystics, theistic or not, see their mystical experience as part of a larger undertaking aimed at human transformation . . . and not as the terminus of their efforts. Thus, in general, ‘mysticism’ would best be thought of as a constellation of distinctive practices, discourses, texts, institutions, traditions, and experiences aimed at human transformation, variously defined in different traditions.'

John von Neumann (2014), Mathematical Foundations of Quantum Mechanics, translated from the German by Robert T. Beyer (New Edition), edited by Nicholas A. Wheeler, Princeton UP, Princeton & Oxford. Preface: 'This book is the realization of my long-held intention to someday use the resources of TEX to produce a more easily read version of Robert T. Beyer’s authorized English translation (Princeton University Press, 1955) of John von Neumann’s classic Mathematische Grundlagen der Quantenmechanik (Springer, 1932).'

Kurt Gödel I, On formally undecidable propositions of Principia Mathematica and related systems I, '1 Introduction The development of mathematics towards greater exactness has, as is well known, led to formalization of large areas of it such that you can carry out proofs by following a few mechanical rules. The most comprehensive current formal systems are the system of Principia Mathematica (PM) on the one hand, the Zermelo-Fraenkelian axiom-system of set theory on the other hand. These two systems are so far developed that you can formalize in them all proof methods that are currently in use in mathematics, i.e. you can reduce these proof methods to a few axioms and deduction rules. Therefore, the conclusion seems plausible that these deduction rules are sufficient to decide all mathematical questions expressible in those systems. We will show that this is not true, but that there are even relatively easy problems in the theory of ordinary whole numbers that can not be decided from the axioms. This is not due to the nature of these systems, but it is true for a very wide class of formal systems, which in particular includes all those that you get by adding a finite number of axioms to the above mentioned systems, provided the additional axioms don’t make false theorems provable.'

Laplace's demon - Wikipedia, the free encyclopedia, 'We may regard the present state of the universe as the effect of its past and the cause of its future. An intellect which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom; for such an intellect nothing would be uncertain and the future just like the past would be present before its eyes.' A Philosophical Essay on Probabilities (Essai philosophique sur les probabilités), introduction to the second edition of Théorie analytique des probabilités, based on a lecture given in 1794.

Light cone - Wikipedia, the free encyclopedia, 'A light cone is the path that a flash of light, emanating from a single event E (localized to a single point in space and a single moment in time) and traveling in all directions, would take through spacetime. Imagine the light confined to a two-dimensional plane: the light from the flash spreads out in a circle after the event E occurs, and when the growing circle is graphed with the vertical axis representing time, the result is a cone, known as the future light cone.'

Linear continuum - Wikipedia, the free encyclopedia, 'In the mathematical field of order theory, a continuum or linear continuum is a generalization of the real line. Formally, a linear continuum is a linearly ordered set S of more than one element that is densely ordered, i.e., between any two members there is another, and which "lacks gaps" in the sense that every non-empty subset with an upper bound has a least upper bound.'

Martin A Nowak, Joshua B. Plotkin & Vincent A.A. Jansen, The evolution of syntactic communication, 'Animal communication is typically non-syntactic, which means that signals refer to whole situations. Human language is syntactic, and signals consist of discrete components that have their own meaning. Syntax is a prerequisite for taking advantage of combinatorics, that is, "making infinite use of finite means". The vast expressive power of human language would be impossible without syntax, and the transition from non-syntactic to syntactic communication was an essential step in the evolution of human language. . . . '

Mathematical formulation of quantum mechanics - Wikipedia, the free encyclopedia, 'The mathematical formulation of quantum mechanics is the body of mathematical formalisms which permits a rigorous description of quantum mechanics. It is distinguished from mathematical formalisms for theories developed prior to the early 1900s by the use of abstract mathematical structures, such as infinite-dimensional Hilbert spaces and operators on these spaces. Many of these structures were drawn from functional analysis, a research area within pure mathematics that developed in parallel with, and was influenced by, the needs of quantum mechanics. In brief, values of physical observables such as energy and momentum were no longer considered as values of functions on phase space, but as eigenvalues of linear operators.'

Matrix mechanics - Wikipedia, the free encyclopedia, 'Matrix mechanics is a formulation of quantum mechanics created by Werner Heisenberg, Max Born, and Pascual Jordan in 1925. Matrix mechanics was the first conceptually autonomous and logically consistent formulation of quantum mechanics. It extended the Bohr Model by describing how the quantum jumps occur. It did so by interpreting the physical properties of particles as matrices that evolve in time. It is equivalent to the Schrödinger wave formulation of quantum mechanics, and is the basis of Dirac's bra-ket notation for the wave function.'

Measurement in quantum mechanics - Wikipedia, the free encyclopedia, 'The framework of quantum mechanics requires a careful definition of measurement. The issue of measurement lies at the heart of the problem of the interpretation of quantum mechanics, for which there is currently no consensus.'

Measurement uncertainty - Wikipedia, the free encyclopedia, 'In metrology, measurement uncertainty is a non-negative parameter characterizing the dispersion of the values attributed to a measured quantity. The uncertainty has a probabilistic basis and reflects incomplete knowledge of the quantity. All measurements are subject to uncertainty and a measured value is only complete if it is accompanied by a statement of the associated uncertainty.'

Meliorism - Wikipedia, the free encyclopedia, 'Meliorism is an idea in metaphysical thinking holding that progress is a real concept leading to an improvement of the world. It holds that humans can, through their interference with processes that would otherwise be natural, produce an outcome which is an improvement over the aforementioned natural one. Meliorism, as a conception of the person and society, is at the foundation of contemporary liberal democracy and human rights and is a basic component of liberalism.'

Minkowski space - Wikipedia, the free encyclopedia, 'In mathematical physics, Minkowski space or Minkowski spacetime is a combination of Euclidean space and time into a four-dimensional manifold where the spacetime interval between any two events is independent of the inertial frame of reference in which they are recorded. Although initially developed by mathematician Hermann Minkowski for Maxwell's equations of electromagnetism, the mathematical structure of Minkowski spacetime was shown to be an immediate consequence of the postulates of special relativity.'

Moore's law - Wikipedia, the free encyclopedia, 'Moore's law is the observation that the number of transistors in a dense integrated circuit (IC) doubles about every two years. Moore's law is an observation and projection of a historical trend. Rather than a law of physics, it is an empirical relationship linked to gains from experience in production.'

Morse code - Wikipedia, the free encyclopedia, 'Morse code is a method of transmitting text information as a series of on-off tones, lights, or clicks that can be directly understood by a skilled listener or observer without special equipment.'

Natural number - Wikipedia, the free encyclopedia, 'In mathematics, the natural numbers are those used for counting ("there are six coins on the table") and ordering ("this is the third largest city in the country"). These purposes are related to the linguistic notions of cardinal and ordinal numbers, respectively (see English numerals). A later notion is that of a nominal number, which is used only for naming.'

Nick Huggett (Stanford Encyclopedia of Philosophy), Zeno's Paradoxes, 'Almost everything that we know about Zeno of Elea is to be found in the opening pages of Plato's Parmenides. There we learn that Zeno was nearly 40 years old when Socrates was a young man, say 20. Since Socrates was born in 469 BC we can estimate a birth date for Zeno around 490 BC. Beyond this, really all we know is that he was close to Parmenides (Plato reports the gossip that they were lovers when Zeno was young), and that he wrote a book of paradoxes defending Parmenides' philosophy. Sadly this book has not survived, and what we know of his arguments is second-hand, principally through Aristotle and his commentators (here I have drawn particularly on Simplicius, who, though writing a thousand years after Zeno, apparently possessed at least some of his book).'

Nowak, Plotkin & Jansen (2000), The evolution of syntacic communication, 'Animal communication is typically non-syntactic, which means that signals refer to whole situations. Human language is syntactic, and signals consist of discrete components that have their own meaning. Syntax is a prerequisite for taking advantage of combinatorics, that is, "making infinite use of finite means''. The vast expressive power of human language would be impossible without syntax, and the transition from non-syntactic to syntactic communication was an essential step in the evolutionof human language. . . . ' back

Observable - Wikipedia, the free encyclopedia, 'In physics, particularly in quantum physics, a system observable is a measurable operator, or gauge, where the property of the system state can be determined by some sequence of physical operations. For example, these operations might involve submitting the system to various electromagnetic fields and eventually reading a value. In systems governed by classical mechanics, any experimentally observable value can be shown to be given by a real-valued function on the set of all possible system states.'

Ordinal number - Wikipedia, the free encyclopedia, 'Whereas the notion of cardinal number is associated with a set with no particular structure on it, the ordinals are intimately linked with the special kind of sets that are called well-ordered (so intimately linked, in fact, that some mathematicians make no distinction between the two concepts). A well-ordered set is a totally ordered set (given any two elements one defines a smaller and a larger one in a coherent way) in which there is no infinite decreasing sequence (however, there may be infinite increasing sequences); equivalently, every non-empty subset of the set has a least element. Ordinals may be used to label the elements of any given well-ordered set (the smallest element being labelled 0, the one after that 1, the next one 2, "and so on") and to measure the "length" of the whole set by the least ordinal that is not a label for an element of the set. This "length" is called the order type of the set.'

OSI model - Wikipedia, the free encyclopedia, 'The Open Systems Interconnection model (OSI model) is a conceptual model that characterizes and standardizes the communication functions of a telecommunication or computing system without regard to its underlying internal structure and technology. Its goal is the interoperability of diverse communication systems with standard protocols. The model partitions a communication system into abstraction layers. The original version of the model defined seven layers.'

Permutation group - Wikipedia, the free encyclopedia, 'In mathematics, a permutation group is a group G whose elements are permutations of a given set M and whose group operation is the composition of permutations in G (which are thought of as bijective functions from the set M to itself). The group of all permutations of a set M is the symmetric group of M, often written as Sym(M). The term permutation group thus means a subgroup of the symmetric group. If M = {1,2,...,n}, then Sym(M), the symmetric group on n letters, is usually denoted by Sₙ. The way in which the elements of a permutation group permute the elements of the set is called its group action. Group actions have applications in the study of symmetries, combinatorics and many other branches of mathematics, physics and chemistry.'

Photon - Wikipedia, the free encyclopedia, 'A photon is an elementary particle, the quantum of all forms of electromagnetic radiation including light. It is the force carrier for the electromagnetic force, even when static via virtual photons. The photon has zero rest mass and as a result, the interactions of this force with matter at long distance are observable at the microscopic and macroscopic levels.'

Planck constant - Wikipedia, the free encyclopedia, 'Since energy and mass are equivalent, the Planck constant also relates mass to frequency. By 2017, the Planck constant had been measured with sufficient accuracy in terms of the SI base units, that it was central to replacing the metal cylinder, called the International Prototype of the Kilogram (IPK), that had defined the kilogram since 1889. . . . For this new definition of the kilogram, the Planck constant, as defined by the ISO standard, was set to 6.626 070 150 × 10^-34 J⋅s exactly.'

Planck-Einstein relation - Wikipedia, the free encyclopedia, 'The Planck–Einstein relation . . . refers to a formula integral to quantum mechanics, which states that the energy of a photon (E) is proportional to its frequency (ν): E = hν. The constant of proportionality, h, is known as the Planck constant.'

Quantum electrodynamics - Wikipedia, the free encyclopedia, 'In particle physics, quantum electrodynamics (QED) is the relativistic quantum field theory of electrodynamics. In essence, it describes how light and matter interact and is the first theory where full agreement between quantum mechanics and special relativity is achieved. QED mathematically describes all phenomena involving electrically charged particles interacting by means of exchange of photons and represents the quantum counterpart of classical electromagnetism giving a complete account of matter and light interaction.'

Quantum mechanics - Wikipedia, the free encyclopedia, 'Quantum mechanics (QM; also known as quantum physics or quantum theory), including quantum field theory, is a fundamental branch of physics concerned with processes involving, for example, atoms and photons. In such processes, said to be quantized, the action has been observed to be only in integer multiples of the Planck constant. This is utterly inexplicable in classical physics.'

Quantum state - Wikipedia, the free encyclopedia, 'In quantum physics, a quantum state is a mathematical entity that provides a probability distribution for the outcomes of each possible measurement on a system. Knowledge of the quantum state together with the rules for the system's evolution in time exhausts all that can be predicted about the system's behavior. A mixture of quantum states is again a quantum state. Quantum states that cannot be written as a mixture of other states are called pure quantum states, while all other states are called mixed quantum states. A pure quantum state can be represented by a ray in a Hilbert space over the complex numbers, while mixed states are represented by density matrices, which are positive semidefinite operators that act on Hilbert spaces.'

Quantum superposition - Wikipedia, the free encyclopedia, 'Quantum superposition is the application of the superposition principle to quantum mechanics. The superposition principle is the addition of the amplitudes of waves from interference. In quantum mechanics it is the sum of wavefunction amplitudes, or state vectors. It occurs when an object simultaneously "possesses" two or more possible values for an observable quantity (e.g. the position or energy of a particle).'

Random-access memory - Wikipedia, the free encyclopedia, 'Random-access memory (RAM) is a form of computer data storage. A random-access memory device allows data items to be read or written in almost the same amount of time irrespective of the physical location of data inside the memory.'

Renaud Joannes-Boyau, New Moroccan fossils suggest humans lived and evolved across Africa 100,000 years earlier than we thought, 'The earliest known existence of modern humans, or Homo sapiens, was previously dated to be around 200,000 years ago. It’s a view supported by genetic analysis and dated Homo sapiens fossils (Omo Kibish, estimated age 195,000 years, and Herto, estimated age 160,000 years), both found in modern-day Ethiopia, East Africa. But new research, published today in two Nature papers, offers a fresh perspective. The latest studies suggest that Homo sapiens spread across the entire African continent more than 100,000 years earlier than previously thought.'

Roman Roads - Wikipedia, the free encyclopedia, 'Roman roads (Latin: viae; singular: via meaning way) were physical infrastructure vital to the maintenance and development of the Roman state, and were built from about 300 BC through the expansion and consolidation of the Roman Republic and the Roman Empire. . . . At the peak of Rome's development, no fewer than 29 great military highways radiated from the capital, and the late Empire's 113 provinces were interconnected by 372 great roads. The whole comprised more than 400,000 km (250,000 mi) of roads, of which over 80,500 kilometres (50,000 mi) were stone-paved.'

Schrödinger equation - Wikipedia, the free encyclopedia, 'In quantum mechanics, the Schrödinger equation is a partial differential equation that describes how the quantum state of a quantum system changes with time. It was formulated in late 1925, and published in 1926, by the Austrian physicist Erwin Schrödinger. . . . In classical mechanics Newton's second law, (F = ma), is used to mathematically predict what a given system will do at any time after a known initial condition. In quantum mechanics, the analogue of Newton's law is Schrödinger's equation for a quantum system (usually atoms, molecules, and subatomic particles whether free, bound, or localized). It is not a simple algebraic equation, but in general a linear partial differential equation, describing the time-evolution of the system's wave function (also called a "state function").'

Signal-to-noise ratio - Wikipedia, the free encyclopedia, 'Signal-to-noise ratio (often abbreviated SNR or S/N) is a measure used in science and engineering that compares the level of a desired signal to the level of background noise. It is defined as the ratio of signal power to the noise power. A ratio higher than 1:1 indicates more signal than noise. While SNR is commonly quoted for electrical signals, it can be applied to any form of signal (such as isotope levels in an ice core or biochemical signaling between cells). Signal-to-noise ratio is sometimes used informally to refer to the ratio of useful information to false or irrelevant data in a conversation or exchange. For example, in online discussion forums and other online communities, off-topic posts and spam are regarded as "noise" that interferes with the "signal" of appropriate discussion.'

Simon Bone and Matias Castro, A Brief History of Quantum Computing, 'Strange as it sounds, the computer of tomorrow could be built around a cup of coffee. The caffeine molecule is just one of the possible building blocks of a 'quantum computer', a new type of computer that promises to provide mind boggling performance that can break secret codes in a matter of seconds.'

Spirituality - Wikipedia, the free encyclopedia, 'Modern usages tend to refer to a subjective experience of a sacred dimension and the "deepest values and meanings by which people live", often in a context separate from organized religious institutions, such as a belief in a supernatural (beyond the known and observable) realm, personal growth, a quest for an ultimate or sacred meaning, religious experience, or an encounter with one's own "inner dimension".'

Stanley Burris, Hilbert and Ackermann's 1928 Logic Book, 'David Hilbert was particularly interested in the foundations of mathematics. Among many other things, he is famous for his attempt to axiomatize mathematics. This now classic text is his treatment of symbolic logic. It lays the groundwork for his later work with Bernays. This translation is based on the second German edition, and has been modified according to the criticisms of Church and Quine. In particular, the authors' original formulation of Gödel's completeness proof for the predicate calculus has been updated. In the first half of the twentieth century, an important debate on the foundations of mathematics took place. Principles of Mathematical Logic represents one of Hilbert's important contributions to that debate. Although symbolic logic has grown considerably in the subsequent decades, this book remains a classic.'

Symmetry - Wikipedia, Symmetry - Wikipedia, the free encyclopedia, 'Symmetry (from Greek συμμετρία symmetria "agreement in dimensions, due proportion, arrangement") in everyday language refers to a sense of harmonious and beautiful proportion and balance. In mathematics, "symmetry" has a more precise definition, that an object is invariant to a transformation, such as reflection but including other transforms too. Although these two meanings of "symmetry" can sometimes be told apart, they are related, so they are here discussed together.' back
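The mathematical sense of symmetry quoted here, invariance under a transformation, is easy to state operationally (a toy sketch of my own, not from the source):

def is_invariant(f, transform, samples):
    # f is symmetric under 'transform' (on these samples) if
    # applying the transform first leaves f's value unchanged.
    return all(f(transform(x)) == f(x) for x in samples)

square = lambda x: x * x
reflect = lambda x: -x
print(is_invariant(square, reflect, range(-10, 11)))  # True: x -> x^2 has reflection symmetry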

Synapse - Wikipedia, Synapse - Wikipedia, the free encyclopedia, 'In the nervous system, a synapse is a structure that permits a neuron (or nerve cell) to pass an electrical or chemical signal to another cell (neural or otherwise). Santiago Ramón y Cajal proposed that neurons are not continuous throughout the body, yet still communicate with each other, an idea known as the neuron doctrine. The word "synapse" (from Greek synapsis "conjunction," from synaptein "to clasp," from syn- "together" and haptein "to fasten") was introduced in 1897 by English physiologist Michael Foster at the suggestion of English classical scholar Arthur Woollgar Verrall.' back

Telecommunications industry - Wikipedia, Telecommunications industry - Wikipedia, the free encyclopedia, ' The telecommunications industry within the sector of information and communication technology is made up of all telecommunications/telephone companies and internet service providers and plays a crucial role in the evolution of mobile communications and the information society. . . . Digital subscriber line (DSL) is the main broadband telecom technology. The fastest growth comes from (value-added) services delivered over mobile networks. . . . Think of telecommunications as the world's biggest machine. Strung together by complex networks, telephones, mobile phones and internet-linked PCs, the global system touches nearly all of us. [Investopedia]' back

Telephone exchange - Wikipedia, Telephone exchange - Wikipedia, the free encyclopedia, 'A telephone exchange is a telecommunications system used in the public switched telephone network or in large enterprises. An exchange consists of electronic components and in older systems also human operators that interconnect (switch) telephone subscriber lines or virtual circuits of digital systems to establish telephone calls between subscribers.' back

Thomas Ainsworth (Stanford Encyclopedia of Philosophy), Form vs. Matter, 'Aristotle famously contends that every physical object is a compound of matter and form. This doctrine has been dubbed “hylomorphism”, a portmanteau of the Greek words for matter (hulê) and form (eidos or morphê). Highly influential in the development of Medieval philosophy, Aristotle’s hylomorphism has also enjoyed something of a renaissance in contemporary metaphysics.' back

URL - Wikipedia, URL - Wikipedia, the free encyclopedia, ' A Uniform Resource Locator (URL), colloquially termed a web address, is a reference to a web resource that specifies its location on a computer network and a mechanism for retrieving it. A URL is a specific type of Uniform Resource Identifier (URI) although many people use the two terms interchangeably. URLs occur most commonly to reference web pages (http), but are also used for file transfer (ftp), email (mailto), database access (JDBC), and many other applications.' back
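The entry's point that a URL names both a retrieval mechanism and a location can be seen by splitting one apart; a small sketch using Python's standard urllib (the example address is hypothetical):

from urllib.parse import urlparse

parts = urlparse("https://www.example.com/path/page.html?q=network")
print(parts.scheme)  # 'https': the retrieval mechanism
print(parts.netloc)  # 'www.example.com': the location on the network
print(parts.path)    # '/path/page.html': the resource at that location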

Visual system - Wikipedia, Visual system - Wikipedia, the free encyclopedia, 'The visual system is the part of the central nervous system which gives organisms the ability to process visual detail, as well as enabling the formation of several non-image photo response functions. It detects and interprets information from visible light to build a representation of the surrounding environment.' back

Werner Heisenberg, Quantum-theoretical re-interpretation of kinematic and mechanical relations, 'The present paper seeks to establish a basis for theoretical quantum mechanics founded exclusively upon relationships between quantities which in principle are observable.' back

Wireless - Wikipedia, Wireless - Wikipedia, the free encyclopedia, 'Wireless communication is the transfer of information or power between two or more points that are not connected by an electrical conductor.' back

Wojciech Hubert Zurek, Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical, '(Submitted on 17 Mar 2007 (v1), last revised 18 Mar 2008 (this version, v3)) Measurements transfer information about a system to the apparatus, and then further on – to observers and (often inadvertently) to the environment. I show that even imperfect copying essential in such situations restricts possible unperturbed outcomes to an orthogonal subset of all possible states of the system, thus breaking the unitary symmetry of its Hilbert space implied by the quantum superposition principle. Preferred outcome states emerge as a result. They provide a framework for the "wavepacket collapse", designating terminal points of quantum jumps, and defining the measured observable by specifying its eigenstates.' back
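The restriction to an orthogonal subset of states mentioned in this abstract echoes the standard no-cloning argument, sketched here in conventional notation (my summary, not quoted from the paper). If a single unitary U copied two states faithfully,

U|\psi\rangle|0\rangle = |\psi\rangle|\psi\rangle, \qquad U|\varphi\rangle|0\rangle = |\varphi\rangle|\varphi\rangle,

then, because unitary transformations preserve inner products,

\langle\psi|\varphi\rangle = \langle\psi|\varphi\rangle^{2},

so \langle\psi|\varphi\rangle must be 0 or 1: distinct states can be copied only if they are orthogonal, which is why the information transfer Zurek describes picks out a preferred set of outcome states.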

Zeno's paradoxes - Wikipedia, Zeno's paradoxes - Wikipedia, the free encyclopedia, 'Zeno's paradoxes are a set of problems generally thought to have been devised by Zeno of Elea to support Parmenides's doctrine that "all is one" and that, contrary to the evidence of our senses, the belief in plurality and change is mistaken, and in particular that motion is nothing but an illusion.' back

www.scientific_theology.com is maintained by The Theology Company Proprietary Limited, ACN 097 887 075, ABN 74 097 887 075. Copyright © 2000-2021 Jeffrey Nicholls