Friday, May 23, 2008

Selecting Life

Lee Smolin's LQG-based theory of the origins of the universe favours the selection and evolution of larger universes over time. Larger universes are more likely to generate life and more likely to enable life to develop to its full potential. Smaller universes would spawn fewer suns, fewer supernovas and therefore fewer black holes from which new universes might emerge. Smaller universes would also create fewer heavy elements such as carbon, nitrogen, sulphur and oxygen, from which the complex molecules of life are built.

The Anthropic Principle also gains an extra dimension of significance through the combination of Smolin's theory and D-Net, because it can be inferred that a universe with laws and constants such as ours is required for life to kick-start as an efficient information processor and progress on its inevitable path towards higher intelligence.
The Anthropic Principle in all its manifestations now has a broader significance. Not only must those parts of the universe in which life currently exists conform to the boundary conditions that produce elements capable of being linked into the complex molecular forms of the first cells or templates of life; the life that emerges must also operate as an efficient information processor in order to survive.
Further, the fact that life currently exists in our cosmos, evolving towards greater complexity, which in D-Net is based on a decision-based selection process, means that life has been selected as the most efficient information-processing engine by the environment of the universe, which according to the new LQG model may itself be a causal web of quantum processes and information. The selection process will then inevitably lead to the emergence of more efficient information processors, or a subset of systems which can evolve by processing information as well as energy. This is of central significance because it indicates that life and consciousness are critical to the survival of the universe and allow the emergence of a super-intelligent entity- Omega. Within this context the significance of the final observer or ultimate Omega becomes apparent. Its knowledge and wisdom will encompass and be co-existent with the entire universe- part of the causal evolutionary web of existence.

In addition, the primeval mythological notion of a god can be more precisely defined: a communal intelligence pervading the universe - an infinitely complex network of networks and system of systems connecting all life, with each node an evolutionary information processor yet inseparable from the whole; an emergent and continuously unfolding phenomenon leveraging to higher and higher levels of wisdom and sentience- a quantum god defined in a Hilbert decision space.

Frank Tipler, in the marvellous ground-breaking book The Anthropic Cosmological Principle (co-authored with John Barrow), also postulates that life could survive forever. Within a particular closed universe configuration, observers or intelligent life forms would be able to send an infinite number of light rays back and forth between themselves, so generating an infinite amount of information. But there may be a corollary. Such a scenario may also extend the life of the universe as well as the life within it, if the two are co-existent. In other words, if life can process an infinite amount of information, it can extend the life of the universe, as observed by life, to infinity.

Therefore, in order to guarantee its own survival, the multiverse or universal cosmic environment will select those physical forces, states and universal laws most likely to generate efficient information processors- viz. life as we know it.

The Grand Synthesis- Putting it Together

LQG- Loop Quantum Gravity- has now evolved into a synthesis of theories and models, all predicated on various interpretations of knots, links and braids.

In 2005 an Australian physicist, Sundance Bilson-Thompson, now collaborating with Lee Smolin, published a paper on the preon model, which postulates that electrons, quarks, neutrinos and the other matter and energy particles of the Standard Model can be generated from smaller hypothetical particles that carry electric charge- preons (5).

The particular particle generated is a function of the topological structure or braiding, based on the particle’s interaction with others and their environment.
Preons possess length and width and interact by braiding – crossing over and under each other as world lines in spacetime. Three preons combine to form a particle, each preon possessing a third of the charge of an electron, neutrino or any other particle in the standard model. Braids in spacetime might in fact be the source of matter and energy. It has now been shown that the braiding of quantum spacetime can in fact reproduce the lightest particles of the standard model.

Simultaneously it has been postulated by another physicist working with the Smolin team- Fotini Markopoulou, that spacetime braids might also facilitate the operations of quantum computing, with the universe seen as a giant quantum computer- each quantum of LQG's space equivalent to a quantum bit of information. In such a model, space itself may not exist, but becomes a web of information.

In 1987 the mathematician Vaughan Jones applied von Neumann algebras to show that quantum mechanical variables such as energy, position and momentum are related to the topology of knots. This work also showed that knots could be used to represent information as a matrix of 0s and 1s, and to represent the logic gates or operators necessary for quantum computation.

In 1988, Michael Freedman discussed the possibility of using quantum topology for computation based on the discovery that the invariant properties of knots were associated with the quantum physics of a two-dimensional surface evolving in time.

Edward Witten in 1989 also showed that the spatial motion of quantum particles can create braiding in spacetime, whose strands are the histories of individual particles. Performing measurements on a braided system of quantum particles can therefore be equivalent to performing the computation that a knot or braid encodes.

Alexei Kitaev showed in 1997 that Witten's topological braiding interpretation of quantum theory can form the basis of computation, by weaving world lines around each other in (2+1)-dimensional spacetime to create non-Abelian anyons, similar to preons, with one third the electric charge of an electron. In this model non-Abelian anyons occur in pairs, each carrying equal and opposite amounts of 'topological charge' (6).
The state of a quantum computer is stored in the conserved charges that the anyons carry. By choosing the appropriate braiding patterns, the quantum logic operations of computation can be encoded. Recent experiments in the field of fractional quantum Hall physics have provided evidence that anyons may really exist.
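
To make the idea of braids encoding logic operations concrete, the short sketch below composes the braid-generator matrices of the Fibonacci anyon model- a well-studied textbook example of non-Abelian anyons, offered here purely as an illustration rather than as the specific preon or fractional-charge model described above. The F and R matrices are the standard ones for that model; the braid word chosen is arbitrary.

```python
import numpy as np

# Fibonacci anyon model: the fusion space of three anyons is two-dimensional,
# so each elementary braid acts as a 2x2 unitary matrix on that space.
phi = (1 + np.sqrt(5)) / 2                       # golden ratio

# standard F-matrix (basis change) and R-matrix (exchange phases) of the model
F = np.array([[1 / phi,           1 / np.sqrt(phi)],
              [1 / np.sqrt(phi), -1 / phi]])
R = np.diag([np.exp(-4j * np.pi / 5), np.exp(3j * np.pi / 5)])

# braid generators: exchange anyons (1,2) or anyons (2,3)
B1 = R
B2 = F @ R @ F                                   # F is its own inverse

def braid_unitary(word):
    """Compose a braid word such as '12112' into a single unitary operation."""
    U = np.eye(2, dtype=complex)
    for letter in word:
        U = {'1': B1, '2': B2}[letter] @ U
    return U

U = braid_unitary("12112")                       # an arbitrary braiding pattern
print(np.allclose(U.conj().T @ U, np.eye(2)))    # True: the braid is unitary
print(np.round(U, 3))                            # the logic operation it encodes
```

Different braiding patterns produce different unitaries, which is the sense in which a braid 'encodes' a quantum logic operation.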

The new LQG model and the supporting theories which incorporate the braiding of spacetime provide strong support for the D-Net model and constitute the third evidential basis for the theory. D-Net postulates that the logic operators required to drive the quantum decisions necessary to achieve evolution in the universe may be derived from the same braiding process that generates the preon and anyon models of quantum particles and information as defined above.

Further supporting evidence may be derived from Professor Edward Fredkin's model, which extends the concept of cellular automata made famous by John Conway's 1970 Game of Life. In this model rules are applied to a 2D grid of cells, from which patterns emerge that sometimes appear to model living entities.
Stephen Wolfram in his book A New Kind of Science extended this basic model to incorporate multiple dimensions and more sophisticated rules, and provided many examples of how simple rules can generate very complex patterns that mimic those found in the natural world.
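
A minimal sketch of the kind of elementary cellular automaton Wolfram catalogues may help here: a single rule number (Rule 110 is used below purely as an example) is applied line by line to a row of binary cells, and a surprisingly intricate pattern grows out of a trivially simple update.

```python
def step(cells, rule=110):
    """Apply an elementary cellular automaton rule to one row of binary cells."""
    n = len(cells)
    new = []
    for i in range(n):
        # neighbourhood value: left, centre, right bits (wrapping at the edges)
        pattern = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        new.append((rule >> pattern) & 1)        # look up that bit in the rule number
    return new

# start from a single 'on' cell and lay down the tapestry line by line
row = [0] * 79
row[39] = 1
for _ in range(40):
    print(''.join('#' if c else '.' for c in row))
    row = step(row)
```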

In Fredkin's model, the states of the braided patterns correspond to bits of information. The cellular automaton is described as a loom in which the warp is time and weaving takes place line by line. At each time step a cross-thread of the tapestry of the universe is laid down. Where a cross-thread crosses the warp there are two braiding possibilities- under or over.
A binary decision is made as each line of weaving unfolds. This is an evolving cellular automaton.
In Fredkin's theory the way the automaton tapestry evolves is controlled by the dominant Rule governing a set of sub-rules, which applies to every cell in the evolving grid, and a clock governing discrete time steps.
This model therefore reinforces and overlaps with LQG's model of discrete space and time.

However the model also postulates that space is not defined by the dimensions or scale of the cellular matrix, but by the paths that the free particles take in the array based on the information carried by each particle. The physics of the reality that life perceives is a projection of information out from the matrix.

By swapping bits in different states according to the Rule, Fredkin’s loom generates quantum forces and matter particles as well as the laws of physics such as the conservation of energy and momentum.
It is proposed that the Fredkin Rule-set governing the evolution of a quantum information universe is a variant of the principles governing a unified evolutionary model such as D-Net.

It is further proposed that the current overlapping models of reality, incorporating the new braiding concepts of LQG, which in turn may encompass the Wolfram and Fredkin cellular automata-based braiding models, can be further extended and unified by the author’s D-Net Model.

In summary, the D-Net model, incorporating Frieden’s information theory, has the capacity to link a Unified Information-based model of Evolution with a Unified Information model of the Physical Universe.
All Life may therefore be integrated within the fundamental nature of the universe as an emergent sentient information processing entity.

The universe therefore may be represented as an infinite dimensional Quantum network or mesh, with each node acting as a quantum computer. In this model space, time, matter and energy do not exist as the primary or foundational components but as emergent properties of the decision process network.

Life’s ‘consciousness’ may provide the mechanism to produce the evolutionary selection state outcomes from the D-Net decision nodes. Information, matter, energy and spacetime may therefore all be generated from the same decision processes- the braiding or knotting of quantised relational flows through the network.

Quantum Networks

In addition, quantum network theory, which defines the topology of quantum nodes and the propagation and processing of information via entanglement, provides support for LQG and D-Net.

A quantum network is an ensemble of nodes between which, with a certain probability, a connection exists; that is, the nodes exhibit a certain degree of entanglement. It is therefore necessary to create efficient protocols that maximise the probability of achieving maximum entanglement between any of the nodes. These protocols draw on concepts from classical information theory, such as percolation theory, but substantially enhance their efficiency by exploiting quantum phenomena.

One such example is the use of repeaters in classical networks to prevent the exponential decay of the signal with the number of nodes. There is no direct analogue to this in quantum information theory, but quantum mechanics affords greater scope for manipulating quantum bits so that the information can be recovered completely.

The fundamental difference with classical systems is that in quantum networks it is no longer necessary to consider the channels and nodes separately. The network is defined as a single quantum state shared by the nodes, optimised as a global entanglement distribution.

It is also possible under these conditions for different protocols to lead to very different probabilities of achieving maximum entanglement between different nodes. For some special cases, such as one- and two-dimensional networks with special regular geometry, protocols have been derived that are distinctly superior to classical percolation protocols. For the case of a one-dimensional chain the optimum protocol has been found. Even under conditions where the signal would decay exponentially in a classical system, it is possible to achieve zero-loss transmission of quantum information. (Quantum repeaters may thus be regarded as simple quantum networks allowing quantum communication over long distances.)

The calculations show that the system passes through a kind of phase transition with respect to the degree of entanglement: below a certain threshold value for the degree of entanglement, the percolation probability is zero and the transmission from A to B is zero. Above this value the percolation probability assumes a certain fixed value that is independent of the distance between the nodes.

The entanglement distribution in a quantum network thus defines a framework in which statistical methods and concepts such as classical percolation theory quite naturally find application. This leads to a new kind of critical phenomenon, viz an entanglement phase transition. The appropriate critical parameter is the minimum entanglement necessary to establish a perfect quantum channel over long distances.

Accordingly the percolation probability does not decrease exponentially with the distance or the number of nodes. The further development of quantum networks calls for a better understanding of such entanglement and percolation strategies.
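
The entanglement-percolation protocols described above are beyond a few lines of code, but the classical percolation threshold that they generalise is easy to demonstrate. The Monte Carlo sketch below (an illustrative classical toy, not a quantum protocol) estimates the probability that open bonds connect the left and right edges of a square lattice as the bond probability p varies; the sharp rise near p = 0.5 is the classical analogue of the entanglement phase transition discussed here.

```python
import random
from collections import deque

def crosses(n, p):
    """True if open bonds connect the left and right edges of an n x n grid."""
    right = {(r, c): random.random() < p for r in range(n) for c in range(n - 1)}
    down  = {(r, c): random.random() < p for r in range(n - 1) for c in range(n)}
    seen = {(r, 0) for r in range(n)}            # start from the whole left edge
    queue = deque(seen)
    while queue:
        r, c = queue.popleft()
        if c == n - 1:
            return True                          # a percolating path exists
        for nr, nc, open_ in ((r, c + 1, right.get((r, c), False)),
                              (r, c - 1, right.get((r, c - 1), False)),
                              (r + 1, c, down.get((r, c), False)),
                              (r - 1, c, down.get((r - 1, c), False))):
            if open_ and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return False

for p in (0.3, 0.45, 0.5, 0.55, 0.7):
    prob = sum(crosses(30, p) for _ in range(200)) / 200
    print(f"p = {p:.2f}  crossing probability ~ {prob:.2f}")
```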

Cause and Effect

Causal Set Theory, developed by Rafael Sorkin in 1987, provides a key underpinning both to LQG as a theory of quantum gravity and to the author's Decision Network Theory. It postulates that it is not necessary to know the geometry of space-time- only the causal order of all elements within it.
Causal elements may be clustered into sets or causets- clusters of space-time quanta or events related by causal connections, i.e. by an order relation.
These causal elements or quanta link into causal networks and grow over time, corresponding to the expansion rate of space-time. At the same time the theory accurately predicts this expansion rate, i.e. the current value of the cosmological constant.
A Causet, to be more precise, is a discrete set of elements - the basic spacetime building blocks or "elementary events". But whereas in the continuum, spacetime is described, mathematically speaking, by an elaborate web of relationships among the point-events carrying information about contiguity, smoothness, distances and times, for the elements of a causet the only relational information carried is a "partial (or quasi-) order" - for some pairs x, y of elements- the information that x comes before y, or, in other cases, that x comes after y. This ordering is the microscopic counterpart of the macroscopic relation of before and after in time.
This conceptual framework shares quite a lot in common with LQG. Like LQG it argues that space-time is a dynamic and discrete entity and evolves through quantized networks, according to the order in which events take place in time- that is within a causal order.
It corresponds even more closely, however, to the conceptual framework of the D-Net model, because of its abstract causal nature and independence from geometric constraints. The causet nodes reflect the decision nodes of the D-Net framework.
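
A common way to make the causet idea concrete is to 'sprinkle' events at random into a region of flat spacetime and keep only the order relation that holds when one event lies inside the future light cone of another. The sketch below does this in 1+1 dimensions; the number of events and the region are arbitrary illustrative choices, and the construction is the generic causal-set one rather than anything specific to D-Net.

```python
import random

def sprinkle_causet(n=50, size=1.0, seed=1):
    """Sprinkle n events into a square region of 1+1D Minkowski space and
    return the causal partial order between them."""
    random.seed(seed)
    events = [(random.uniform(0, size), random.uniform(0, size)) for _ in range(n)]
    # event i precedes event j iff j lies in i's future light cone: dt > |dx|
    precedes = {(i, j)
                for i, (ti, xi) in enumerate(events)
                for j, (tj, xj) in enumerate(events)
                if tj - ti > abs(xj - xi)}
    return events, precedes

events, precedes = sprinkle_causet()
# 'links' are the irreducible relations: i precedes j with nothing strictly between
links = [(i, j) for (i, j) in precedes
         if not any((i, k) in precedes and (k, j) in precedes
                    for k in range(len(events)))]
print(f"{len(events)} events, {len(precedes)} causal relations, {len(links)} links")
```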

Thursday, May 22, 2008

The Causal Quantum Connection

The concept of a causal cosmos has been extensively defined by Lee Smolin within the theory of Loop Quantum Gravity (4). The major elements of an information-based causal cosmos also support this author’s decision/information-based theory of evolution. The major elements of such a cosmos include the following propositions-

Time and causality are synonymous- there is no notion of a moment of time; only processes and decisions that follow one another by causal necessity.
A causal universe may be modelled by the transfer of information, creating a network of information transfers between nodes. The nodes can represent volumes of space at the Planck level or at a more abstract level- intersecting decision points.
Is time therefore an emergent property of information transfer?


Non-locality can be explained by an evolutionary network. Because all points are connected through the network, it is perhaps possible for information to take short cuts and link events simultaneously across the network.

A network model as applied in both LQG and D-Net is also capable of explaining and predicting other anomalous features of the universe such as chaos. It is important to note that although a chaotic system cannot evolve in the traditional sense, because the system and environmental changes are too erratic and unpredictable for adaptation to occur in real time, such systems have been shown to exhibit underlying determinism, based on the phenomenon of strange attractors. Although somewhat speculative, these underlying deterministic patterns existing within chaos and randomness may be more accessible via a theory of decision networks.

The universe can therefore be modelled as the outcome of a form of computational process, but one which can evolve in time as a consequence of the information flowing through it, guided by a network of decisions. Information input/output is therefore a form of story, the narrative of which is a flow of causal processes represented by flows of information but without an ending- just the continuous evolution of a self-organising cosmos with life, in all its manifestations, as the main actor.

The universe of events is above all a relational universe and the most important relationship is causality. Events may have more than one contributing cause and an event may contribute to more than one future event. The universe therefore has time built into it from the beginning, with time and causality being synonymous. Processes and decisions follow one another by causal necessity. One way to describe such a universe is by information states and it is therefore possible to create a network of information transfers between the decision points or nodes of those states.
Roger Penrose’s Twistor Theory also defines a network of causal relationships between spacetime events without needing to specify locations in space, by mapping geometric objects of space-time into geometric objects of a 4D complex space.

Evolution can therefore be modelled as a process of causal information flows or transfers that are self-organising, guided by a network of decisions as in a neural network, each neuron representing a decision node which fires and transfers information only when a particular decision threshold is reached. Evolution uses the decision network for extracting the most useful information in a particular context, aggregating and sifting events such as genetic mutations and transferring the result once a decision threshold has been reached.
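
The threshold-firing analogy in the previous paragraph can be sketched in a few lines of code. The weights, thresholds and wiring below are arbitrary illustrative choices, not a specification of the D-Net model itself.

```python
from dataclasses import dataclass, field

@dataclass
class DecisionNode:
    """Accumulates weighted input 'information' and transfers it onward
    only once its decision threshold is reached."""
    name: str
    threshold: float
    weights: dict                                # source name -> weight
    downstream: list = field(default_factory=list)
    accumulated: float = 0.0

    def receive(self, source, signal):
        self.accumulated += self.weights.get(source, 0.0) * signal
        if self.accumulated >= self.threshold:   # decision threshold reached
            print(f"{self.name} fires with {self.accumulated:.2f}")
            for node in self.downstream:
                node.receive(self.name, self.accumulated)
            self.accumulated = 0.0               # reset after firing

# toy wiring: two kinds of input event feed a gate node, which feeds an output node
output = DecisionNode("output", threshold=1.0, weights={"gate": 1.0})
gate = DecisionNode("gate", threshold=1.5,
                    weights={"mutation": 1.0, "environment": 0.8},
                    downstream=[output])

gate.receive("mutation", 1.0)      # below threshold: nothing is transferred yet
gate.receive("environment", 1.0)   # threshold crossed: information flows onward
```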

The latest theories of quantum gravity, in particular LQG, have a number of concepts in common with D-Net as mentioned previously. LQG defines space not as a background stage, but as a network of relationships that connect events and processes in the universe, evolving in time. In this model, based on quantum angular momentum or spin networks, the universe is purely a dynamic network of relationships without a spatial coordinate framework. Spin networks therefore do not live in space- they generate space and quantum information. This is the same type of model on which general relativity is also based, which is not surprising as LQG is a quantized version of GR.
D-Net is predicated on a similar network of relationships, mirroring the information flows driving evolution, in the same way that a social network selectively exchanges information between members.

In the LQG model, the topological braiding patterns of its spacetime nodes can encode quantum particles. In the D-Net model, braids encode the quantum logic operations driving its decision processes.

Evolutionary decision networks and LQG spin networks can therefore be defined within a common conceptual framework, both representing causal information flows with their nodes acting as mediators of the flow. Evolutionary networks, however, operate at a more abstract level, but still depend on the relationship between the system and its environment. As a spin network evolves over time it creates a more intricate entity- a spin foam. Similarly, a decision network evolves over time, modelling the evolution of a system, and creates the analogue of a spin foam- a 'decision foam'.

In LQG, each observer has only limited and partial information about the universe, each with a different view that needs to be reconciled with all other observers' perspectives. In the D-Net model, each system's evolution also has to be reconciled with all others to build a composite view of the universe's evolution; each system must interrelate with all others. It is also a consistent theory because the different evolutionary states of the system are correlated. The decision outcomes of each system are correlated with those of all other systems, as are the observers' partial information frameworks in LQG. Context dependence is also central to the mathematical formulation of the theory- the context of the history of the observer. It resolves the paradox of quantum superpositions by making it a consequence of one's point of view, or of decision states related to the Wheeler-DeWitt equations, or of a superposition of evolving entities.

It can also be shown that superposition-based quantum networks can be based on the classical perceptron model of multilayered, feedforward neural networks and the algebraic model of evolving quantum structures as described in quantum gravity. (ref-) The main feature of this model is its evolution from particular neural topologies to a quantum metastructure which embodies many differing topological patterns. Using quantum parallelism, evolutionary training is possible on superpositions of different network topologies. As a result, not only classical transition functions, but also the topology of the quantum metastructure becomes a subject of evolutionary training.
This suggests that decision networks also can be considered as candidates for this model in the same way- as topologically evolving quantum networks.

D-Net also helps solve the conundrum of Everett's Many Worlds Interpretation of the universe. In the LQG model the universe gradually unfolds and is continually presented with alternate pathways of development or splits in histories. In the Many Worlds Interpretation, instead of the universe splitting each time a decision is made by the observer, the quantum decision processes in the brain create new divergent pathways through a different sequence or network of events in a Hilbert quantum space, creating new observer-dependent worlds, but not new physical universes. By adding decision processes to the mix, D-Net can help explain how each divergent path is selected.
The decision processes of D-Net can therefore be linked to the quantum processes of spacetime in a quantized theory such as LQG. The uncertainty of information flowing through the network is also consistent with Heisenberg’s Uncertainty principle and is also at the heart of Frieden’s information action principle. In addition D-Net helps solve the problem of future time travellers returning to the past. The travellers would reach their destination but then travel through different spin or decision networks, avoiding the risk of tangling with present states.

To understand complex systems such as people or cultures we need to know not just their causal histories, as in LQG, but also their evolutionary histories: why they took the evolutionary paths they did. Physical events alone, and the information they generate, do not fully encompass the complexity of life. Complex entities like people and cultures are processes unfolding in time. Life's emergence can only be fully understood by tracing their evolutionary decision pathways- the network connections between events.

D-Net reflects the central role of the evolving system, creating pathways through evolutionary events in spacetime. It is therefore consistent with LQG in terms of combining information from a vast number of decisions over evolutionary time, reflecting the context and history of the system- its evolutionary options and choices.
In the evolutionary model, the system is not only an integral part of the evolving universe- it eventually subsumes it.

In LQG a set of questions can be specified about the history of the universe and the probabilities of the different answers computed. The questions or decision options in a sense bring the reality into existence. This is precisely the premise of Frieden’s theory and also D-Net. According to Smolin it is possible to construct a pluralistic version of quantum cosmology in which there is one universe but many histories or mathematical versions of what an observer can see. It is consistent only when two observers ask the same question and agree on the answer. In LQG, there exists one world seen as a jigsaw or overarching composite structure by many different observers; not multiple worlds as seen by one final observer from outside it. The universe in this model therefore contains many different consistent histories, each of which can be brought into existence by asking the right set of questions and measuring the information outcomes. This could describe an infinite set of quantum worlds, each of which corresponds to part of the world seen by a particular observer at a particular place and time in the history of the universe.

Asking the right questions however presupposes making the right decisions on what questions to ask. This corresponds to the role of the observer as postulated in D-Net. Life as the set of multiple observers expands and evolves to become co-existent with the universe; an Omega or Universal Consciousness. D-net therefore ensures that the right decisions are made to ensure that the universe maintains its integrity and life realises its full potential. As life and the universe merge into one co-existent sentient entity, the universe is enabled to reach its full potential.

Wednesday, May 21, 2008

Untangling Time- Back to the Future

The equations of physics are generally time-symmetric, with no difference in function between the future and past. For example, the equations of electrodynamics admit wave solutions travelling both backwards and forwards in time.

Several experiments have been devised to test the notion of reverse or retro-causality, including a variation of the classic quantum mechanics wave/particle experiment, delaying measurement until after the photons had passed through the double slit. However, this is not deemed sufficiently rigorous as a proof, as the path the photon took before the measurement could not be verified.

Physicist John Cramer has recently proposed a more rigorous test using the 'transactional interpretation' of quantum mechanics, which proposes that particles interact by sending and receiving physical waves that travel backwards and forwards through time. He has devised an experiment based on the entanglement of photons and their properties such as momentum, which then share the same wave or particle behaviour (7) (8).

Pairs of photons from a laser beam are entangled before being sent along two different paths. One stream is delayed by several microseconds by sending it through a 10 km optical fibre cable. A moveable detector can then be used to sense a photon in two positions, as either a wave or a particle.
Choosing to measure the delayed photons as waves or particles forces the photons in the other beam to be detected the same way, because they are entangled. This choice therefore influences the measurement of the entangled photon even though it is made earlier.

If retro-causality is proved, it could solve the major enigma of quantum entanglement and non-locality; already defined within the context of the D-net model, but at the same time would require a complete reformulation of the laws of causality.

Measuring one entangled particle could send a wave backwards through time to the moment at which the pair was created, without being limited by the speed of light. Retro-causality might also help explain why we live in a universe so finely tuned to the existence of life, as Paul Davies has speculated. The universe may be able to retrofit its parameters using reverse causality to ensure the emergence of life as we know it.

D-Net provides a possible mechanism to achieve such a reverse engineering process, by allowing retracing of the pathways encoded as evolutionary histories. The parameters of the big bang could then be adjusted by a reverse evolutionary process to provide a way of accelerating the selection of a form of life capable of evolving into a more efficient information processor. The presence of intelligent observers later in history could therefore exert an influence on the big bang to reverse engineer the most appropriate conditions for the emergence of an optimally efficient form of life.

The D-Net model would allow this possibility because a causal network would allow both reverse or forward pathways to be established as the most efficient history.
Biologically it has already been shown that life can be reverse engineered, for example to produce chickens with teeth and snakes with limbs, by reversing evolutionary pathways.

In addition the latest interpretation of the Anthropic Principle suggests a much broader range of physical settings governing the creation of a universe which could lead to the evolution of life as we know it (9). For example, an alternate pathway has recently been formulated allowing life to evolve without the existence of the weak force. It may be possible to define many other evolutionary pathways that could result in an alternate intelligent life form, capable of efficiently processing information.

Friday, May 16, 2008

The Unitary Quantum Connection

Computing is a process of mapping from inputs to outputs. A quantum computer performs a physical mapping from initial quantum states to final classical states. This solution mapping process is called the ‘unitary evolution’ (3) of the states of the system.

Solving a problem also represents a mapping from an information input parameter set to an output solution set. NP problems resist solution under the constraint of serial computing because, as the input parameter space expands, the number of possible mapping pathways to reach an output solution also expands exponentially. Such problems may be solved more efficiently by parallel quantum computing, the power of which increases at a rate comparable to the exponential increase in the solution set pathways or unitary evolution of the system. This can be modelled as a 'sum over histories' or path integral formulation of quantum theory.
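
A toy illustration of the sum-over-histories idea: enumerate every path through a small directed graph from an input node to an output node, assign each path a complex amplitude, and add the amplitudes so that the paths interfere. The graph, weights and phases below are invented purely for illustration and are not Frieden's or the author's actual formalism.

```python
import cmath

# a small directed graph; each edge carries a complex amplitude (weight * phase)
edges = {
    "in":  [("a", 0.8 * cmath.exp(0.3j)), ("b", 0.6 * cmath.exp(1.2j))],
    "a":   [("b", 0.5 * cmath.exp(-0.4j)), ("out", 0.7 * cmath.exp(0.9j))],
    "b":   [("out", 0.9 * cmath.exp(0.1j))],
    "out": [],
}

def histories(node, target, amplitude=1 + 0j, path=("in",)):
    """Yield every path (history) from node to target together with its amplitude."""
    if node == target:
        yield path, amplitude
        return
    for nxt, amp in edges[node]:
        yield from histories(nxt, target, amplitude * amp, path + (nxt,))

total = 0
for path, amp in histories("in", "out"):
    total += amp                         # amplitudes add, so histories interfere
    print(" -> ".join(path), f"  amplitude {amp:.3f}")

print(f"sum over histories: {total:.3f}   probability ~ {abs(total)**2:.3f}")
```

Paths whose amplitudes largely cancel contribute little to the total, while a few dominant routes account for most of the probability.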

The D-Net model is also supported by the Unitary Quantum Transform process as follows-
As with computing and problem-solving processes, evolution may also be thought of as a mapping from the system's present information state to the target state of the environment. The Action in this case is the integral of the decision paths over an 'evolutionary field'. The principle of Least Action minimises this integral, which is the mathematical process that drives evolution in the author's theory. The D-Net evolutionary decision network provides a model of this mapping mechanism. The model predicts many different decision paths in the configuration space from input to output, but only a small proportion- the most efficient- are relevant to the system. The other paths are those with the smallest probability of success and can be cancelled out.

Each transition path, including the decision nodes embedded in it, creates a decision history. The sum of these decision histories provides the evolutionary mapping function and may be integrated to provide the quantum decision integral, or sum over histories, for the evolutionary process. Life therefore is an extremely efficient information processor because it incorporates adaptive learning decision pathways that over time maximise the efficiency of the information differential reduction process, within a unitary evolution framework.

In a sense the decision paths become shorter and more efficient, enabling the optimal evolutionary outcome or unitary transform to be achieved at the lowest energy cost, just as the geodesic in Riemannian geometry represents the shortest or most efficient path between two points. There therefore appears to be a strong connection between Riemannian geometric functions and evolutionary decision functions, which allows the formulation of a metric to calculate the efficiency of the evolutionary process. In effect, the minimal geodesic distance between the information state of the observing system and the target information state of the system's environment, applying a unitary operator, is equivalent to the minimal number of 'decision gates' or nodes and the number of connecting links or steps required to achieve the appropriate adaptation.

In this process the complexity of the system is increased, through the acquisition of additional information, plus greater flexibility and efficiency in resolving the differential, which can therefore be applied to adapt to more complex future environmental challenges. Such a metric would therefore be a function of the density of decision operations or transformations required to implement the unitary evolutionary transform plus a measure of the efficiency of the information feedback mechanism to keep the evolutionary process on track. In effect, the evolution of the ‘decision matrix’ applied to minimise the information differential, allows the evolution of the system as a whole to be achieved. This in turn could be calculated from Frieden’s Theory combined with existing Quantum Information, Unitary and Network Theory.