Lee Smolin’s theory of cosmological natural selection, developed alongside his work on LQG, favours the selection and evolution of larger universes over time. Larger universes are more likely to generate life and more likely to enable life to develop to its full potential. Smaller universes would spawn fewer suns, fewer supernovas and therefore fewer black holes from which new universes might emerge. Smaller universes would also create fewer heavy elements such as carbon, nitrogen, sulphur and oxygen, from which the complex molecules of life are created.
The Anthropic Principle also gains an extra dimension of significance through the combination of Smolin’s theory and D-Net, because it can be inferred that a universe with laws and constants such as ours is required for life to start as an efficient information processor and progress on its inevitable path towards higher intelligence.
The Anthropic Principle in all its manifestations now has a broader significance. Not only must those parts of the universe in which life currently exists conform to the boundary conditions that produce elements capable of being linked into the complex molecules of the first cells or templates of life; the life that emerges must also operate as an efficient information processor in order to survive.
Further, the fact that life currently exists in our cosmos, evolving towards greater complexity- which in D-Net rests on a decision-based selection process- means that life has been selected as the most efficient information processing engine by the environment of the universe, which according to the new LQG model may itself be a causal web of quantum processes and information. The selection process will then inevitably lead to the emergence of more efficient information processors, or a subset of systems which can evolve by processing information as well as energy. This is of central significance because it indicates that life and consciousness are critical to the survival of the universe and allow the emergence of a super-intelligent entity- Omega. Within this context the significance of the final observer or ultimate Omega becomes apparent. Its knowledge and wisdom will encompass and be co-existent with the entire universe- part of the causal evolutionary web of existence.
In addition, the primeval mythological notion of a god can be more precisely defined: a communal intelligence pervading the universe- an infinitely complex network of networks and system of systems connecting all life, with each node an evolving information processor yet inseparable from the whole; an emergent and continuously unfolding phenomenon rising to higher and higher levels of wisdom and sentience- a quantum god defined in a Hilbert decision space.
Frank Tipler, in the ground-breaking book The Anthropic Cosmological Principle (co-authored with John Barrow), also postulates that life could survive forever. Within a particular closed-universe configuration, observers or intelligent life forms would be able to send an infinite number of light rays back and forth between themselves, so generating an infinite amount of information. But there may be a corollary: such a scenario may also extend the life of the universe as well as of life itself, if the two are co-existent. In other words, if life can process an infinite amount of information it can extend the life of the universe, as observed by life, to infinity.
Therefore, in order to guarantee its own survival, the multiverse or universal cosmic environment will select those physical forces, states and universal laws most likely to generate efficient information processors- viz. life as we know it.
Friday, May 23, 2008
The Grand Synthesis- Putting it Together
Loop Quantum Gravity (LQG) has now evolved into a synthesis of theories and models, all predicated on various interpretations of knots, links and braids.
In 2005 the Australian physicist Sundance Bilson-Thompson, now collaborating with Lee Smolin, published a paper on the preon model, which postulates that electrons, quarks, neutrinos and every other matter or energy particle of the Standard Model can be generated from smaller hypothetical particles that carry electric charge- preons (5).
The particular particle generated is a function of the topological structure or braiding, which depends on the particle’s interactions with others and with their environment.
Preons possess length and width and interact by braiding- crossing over and under each other as world lines in spacetime. Three preons combine to form a particle, each preon carrying one third of the charge of an electron; different braidings yield the neutrino and the other particles of the Standard Model. Braids in spacetime might in fact be the source of matter and energy: it has since been shown that the braiding of quantum spacetime can encode the lightest particles of the Standard Model.
At the same time Fotini Markopoulou, another physicist working with Smolin’s team, has postulated that spacetime braids might also support the operations of quantum computing, with the universe seen as a giant quantum computer- each quantum of LQG’s space equivalent to a quantum bit of information. In such a model space itself may not exist as a primary entity, but becomes a web of information.
In 1987 the mathematician Vaughan Jones applied von Neumann algebras to show that quantum mechanical variables such as energy, position and momentum are related to the topology of knots. This work also showed that knots could be used to represent information as a matrix of 0s and 1s, and to represent the logic gates or operators necessary for quantum computation.
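As a toy illustration of braid data living in a matrix of 0s and 1s, the following Python sketch (a deliberate simplification, not Jones’s construction) maps a braid word to the product of permutation matrices recording how its strands are reordered; it forgets the over/under information of each crossing, keeping only the coarsest binary-matrix shadow of the braid.

import numpy as np

def crossing(i, n):
    """Permutation matrix of 0s and 1s swapping strands i and i+1 of an n-strand braid."""
    P = np.eye(n, dtype=int)
    P[[i, i + 1]] = P[[i + 1, i]]
    return P

# the braid word sigma1 sigma2 sigma1 on three strands
word = [0, 1, 0]
M = np.eye(3, dtype=int)
for i in word:
    M = crossing(i, 3) @ M
print(M)  # the 0/1 matrix recording the net reordering of the strands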
In 1988, Michael Freedman discussed the possibility of using quantum topology for computation based on the discovery that the invariant properties of knots were associated with the quantum physics of a two-dimensional surface evolving in time.
In 1989 Edward Witten also showed that the spatial motion of quantum particles can create braiding in spacetime, whose strands are the histories of individual particles. Performing measurements on a braided system of quantum particles can therefore be equivalent to performing the computation that a knot or braid encodes.
Alexei Kitaev showed in 1997 that Witten’s topological braiding interpretation of quantum theory can form the basis of computation, by weaving world lines around each other in 4D spacetime to create non-abelian anyons- similar to preons, with one third the electric charge of an electron. In this model non-abelian anyons occur in pairs, each carrying equal and opposite amounts of ‘topological charge’ (6).
The state of a quantum computer is stored in the conserved charges that the anyons carry. By choosing the appropriate braiding patterns, the quantum logic operations of computation can be encoded. Recent experiments in fractional quantum Hall physics have provided evidence that anyons really exist.
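To make the braiding-as-logic idea concrete, here is a minimal Python sketch using the standard Fibonacci anyon representation- a textbook non-abelian anyon model, not the specific preon or fractional-charge scheme above. It builds the two elementary braid moves from the F and R matrices and checks that they are unitary (reversible logic operations) and satisfy the defining braid relation.

import numpy as np

phi = (1 + np.sqrt(5)) / 2  # golden ratio

# F-matrix (fusion basis change) and R-matrix (exchange phases) for Fibonacci anyons
F = np.array([[1 / phi, 1 / np.sqrt(phi)],
              [1 / np.sqrt(phi), -1 / phi]])
R = np.diag([np.exp(-4j * np.pi / 5), np.exp(3j * np.pi / 5)])

sigma1 = R          # braid the first pair of anyons
sigma2 = F @ R @ F  # braid the second pair (F is its own inverse)

I2 = np.eye(2)
print(np.allclose(sigma1 @ sigma1.conj().T, I2))  # True: a reversible logic operation
print(np.allclose(sigma2 @ sigma2.conj().T, I2))  # True
# the defining braid (Yang-Baxter) relation: s1 s2 s1 == s2 s1 s2
print(np.allclose(sigma1 @ sigma2 @ sigma1, sigma2 @ sigma1 @ sigma2))  # True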
The new LQG model and its supporting theories, which incorporate the braiding of spacetime, provide strong support for the D-Net model and constitute the third evidential basis for the theory. D-Net postulates that the logic operators required to drive the quantum decisions necessary to achieve evolution in the universe may be derived from the same braiding process that generates the preon and anyon models of quantum particles and information defined above.
Further supporting evidence may be derived from Professor Edward Fredkin’s model, which extends the concept of cellular automata made famous by John Conway’s 1970 Game of Life. In such models rules are applied to a 2D grid of cells, from which patterns emerge that sometimes appear to model living entities.
Stephen Wolfram, in his book A New Kind of Science, extended this basic model to incorporate multiple dimensions and more sophisticated rules, and provided many examples of how simple rules can generate very complex patterns that mimic those found in the natural world.
In Fredkin’s model the states of the braided patterns correspond to bits of information. The cellular automaton is described as a loom in which the warp is time and weaving takes place line by line. At each time step a cross-thread of the tapestry of the universe is laid down. Where a cross-thread crosses the warp there are two braiding possibilities- under or over.
A binary decision is made as each line of weaving unfolds. This is an evolving cellular automaton.
In Fredkin’s theory the way the automaton tapestry evolves is controlled by a dominant Rule governing a set of sub-rules, which applies to every cell in the evolving grid, together with a clock governing discrete time steps.
This model therefore reinforces and overlaps with LQG's model of discrete space and time.
However the model also postulates that space is not defined by the dimensions or scale of the cellular matrix, but by the paths that the free particles take in the array based on the information carried by each particle. The physics of the reality that life perceives is a projection of information out from the matrix.
By swapping bits in different states according to the Rule, Fredkin’s loom generates quantum forces and matter particles as well as the laws of physics such as the conservation of energy and momentum.
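The loom mechanism can be illustrated with an elementary one-dimensional cellular automaton in the Wolfram style: one global rule applied to every cell at each discrete time step, with each new line of the weave fixed by binary under/over (0/1) outcomes. The Python sketch below uses Rule 110 purely as a familiar example of a simple rule generating complex patterns; it is an illustration of the general mechanism, not Fredkin’s actual Rule-set.

RULE = 110  # any 8-bit rule number; 110 is a famously rich example

def weave_line(cells, rule=RULE):
    """Apply the global rule to every cell once- one new line of the tapestry."""
    n = len(cells)
    nxt = []
    for i in range(n):
        # each cell's next state depends on its left neighbour, itself and its right neighbour
        pattern = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        nxt.append((rule >> pattern) & 1)  # a binary under/over decision
    return nxt

cells = [0] * 31
cells[15] = 1  # a single seed thread
for t in range(16):
    print(''.join('#' if c else '.' for c in cells))
    cells = weave_line(cells)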
It is proposed that the Fredkin Rule-set governing the evolution of a quantum information universe is a variant of the principles governing a unified evolutionary model such as D-Net.
It is further proposed that the current overlapping models of reality, incorporating the new braiding concepts of LQG, which in turn may encompass the Wolfram and Fredkin cellular automata-based braiding models, can be further extended and unified by the author’s D-Net Model.
In summary, the D-Net model, incorporating Frieden’s information theory, has the capacity to link a Unified Information-based model of Evolution with a Unified Information model of the Physical Universe.
All Life may therefore be integrated within the fundamental nature of the universe as an emergent sentient information processing entity.
The universe may therefore be represented as an infinite-dimensional quantum network or mesh, with each node acting as a quantum computer. In this model space, time, matter and energy do not exist as primary or foundational components, but as emergent properties of the decision-process network.
Life’s ‘consciousness’ may provide the mechanism to produce the evolutionary selection state outcomes from the D-Net decision nodes. Information, matter, energy and spacetime may therefore all be generated from the same decision processes- the braiding or knotting of quantised relational flows through the network.
Quantum Networks
In addition, quantum network theory, which defines the topology of quantum nodes and the propagation and processing of information via entanglement, provides support for LQG and D-Net.
A quantum network is an ensemble of nodes between which connections exist with a certain probability; that is, the nodes exhibit a certain degree of entanglement. It is therefore necessary to create efficient protocols that maximise the probability of achieving maximum entanglement between any of the nodes. The protocols draw on concepts from classical information theory (percolation theory), but they substantially enhance their efficiency by enlisting quantum phenomena.
One such example is the use of repeaters in classical networks to prevent the exponential decay of the signal with the number of nodes. There is no direct analogue of this in quantum information theory, but quantum mechanics affords greater possibilities for manipulating quantum bits so as to recover the information completely.
The fundamental difference with classical systems is that in quantum networks it is no longer necessary to consider the channels and nodes separately. The network is defined as a single quantum state shared by the nodes, optimised as a global entanglement distribution.
It is also possible under these conditions for different protocols to lead to very different probabilities of achieving maximum entanglement between different nodes. For some special cases, such as one- and two-dimensional networks with special regular geometry, protocols have been derived that are distinctly superior to classical percolation protocols. For the case of a one-dimensional chain the optimal protocol has been found. Even under conditions where the signal would decay exponentially in a classical system, it is possible to achieve zero-loss transmission of quantum information. (Quantum repeaters may thus be regarded as simple quantum networks allowing quantum communication over long distances.)
The calculations show that the system passes through a kind of phase transition with respect to the degree of entanglement: below a certain threshold value for the degree of entanglement, percolation fails and the transmission probability from A to B is zero. Above this value the percolation probability assumes a fixed value that is independent of the distance between the nodes.
The entanglement distribution in a quantum network thus defines a framework in which statistical methods and concepts such as classical percolation theory quite naturally find application. This leads to a new kind of critical phenomenon, viz an entanglement phase transition. The appropriate critical parameter is the minimum entanglement necessary to establish a perfect quantum channel over long distances.
Accordingly the percolation probability does not decrease exponentially with the distance or the number of nodes. The further development of quantum networks calls for a better understanding of such entanglement and percolation strategies.
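The threshold behaviour can be illustrated with a purely classical Monte Carlo sketch, assuming the simplest analogue: bonds on an L x L square lattice are ‘entangled’ with probability p, and we test whether a connected path crosses from the left edge to the right. Near the critical value (about 0.5 for this geometry) the crossing probability jumps from near zero to near one.

import random

def crosses(L, p):
    """One sample: open each bond with probability p, then test for a left-right path."""
    right = [[random.random() < p for x in range(L)] for y in range(L)]  # bond (x,y)-(x+1,y)
    down = [[random.random() < p for x in range(L)] for y in range(L)]   # bond (x,y)-(x,y+1)
    seen = {(0, y) for y in range(L)}  # flood from the entire left edge
    stack = list(seen)
    while stack:
        x, y = stack.pop()
        if x == L - 1:
            return True
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < L and 0 <= ny < L and (nx, ny) not in seen:
                open_bond = right[y][min(x, nx)] if ny == y else down[min(y, ny)][x]
                if open_bond:
                    seen.add((nx, ny))
                    stack.append((nx, ny))
    return False

for p in (0.30, 0.45, 0.50, 0.55, 0.70):
    rate = sum(crosses(40, p) for _ in range(200)) / 200
    print(f"p = {p:.2f}  crossing probability = {rate:.2f}")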
Cause and Effect
Causal Set Theory, developed by Rafael Sorkin in 1987, provides a key underpinning to LQG as a theory of quantum gravity and to the author’s Decision Network Theory. It postulates that it is not necessary to know the geometry of space-time- only the causal order of all elements within it.
Causal elements may be clustered into sets or causets: clusters of space-time quanta or events related by causal connections- an order relation.
These causal elements or quanta link in causal networks that grow over time, corresponding to the expansion rate of space-time. The theory also predicts, at least to order of magnitude, the current value of the cosmological constant.
A causet, to be more precise, is a discrete set of elements- the basic spacetime building blocks or "elementary events". Whereas in the continuum spacetime is described, mathematically speaking, by an elaborate web of relationships among point-events carrying information about contiguity, smoothness, distances and times, the elements of a causet carry only one piece of relational information, a "partial (or quasi-) order": for some pairs x, y of elements, the information that x comes before y or, in other cases, that x comes after y. This ordering is the microscopic counterpart of the macroscopic relation of before and after in time.
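A minimal Python sketch of a causet, under the usual toy assumptions: sprinkle events at random into a square patch of 1+1 dimensional Minkowski spacetime, keep only the relation ‘x precedes y’ whenever y lies in the future light cone of x, and discard all other geometric data.

import random

random.seed(1)
N = 8
events = [(random.random(), random.random()) for _ in range(N)]  # (t, x) coordinates

def precedes(a, b):
    """Causal order in 1+1D Minkowski space (c = 1): b lies in the future light cone of a."""
    dt, dx = b[0] - a[0], b[1] - a[1]
    return dt > 0 and dt > abs(dx)

# the causet keeps only the order relation; all other geometry is discarded
order = {(i, j) for i in range(N) for j in range(N) if precedes(events[i], events[j])}
print(sorted(order))

# a partial order must be transitive: i < j and j < k implies i < k
assert all((i, k) in order for (i, j) in order for (jj, k) in order if jj == j)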
This conceptual framework has much in common with LQG. Like LQG it argues that space-time is a dynamic and discrete entity which evolves through quantized networks, according to the order in which events take place in time- that is, within a causal order.
It corresponds even more closely, however, to the conceptual framework of the D-Net model, because of its abstract causal nature and independence from geometric constraints. The causet nodes reflect the decision nodes of the D-Net framework.
Thursday, May 22, 2008
The Causal Quantum Connection
The concept of a causal cosmos has been extensively defined by Lee Smolin within the theory of Loop Quantum Gravity (4). The major elements of an information-based causal cosmos also support this author’s decision/information-based theory of evolution. Such a cosmos includes the following propositions-
Time and causality are synonymous- there is no notion of a moment of time; only processes and decisions that follow one another by causal necessity.
A causal universe may be modelled by the transfer of information, creating a network of information transfers between nodes. The nodes can represent volumes of space at the Planck level or at a more abstract level- intersecting decision points.
Is time therefore an emergent property of information transfer?
Non-locality can be explained by an evolutionary network. Because all points are simultaneously connected, it is perhaps possible for information to take short cuts and link events simultaneously across the network.
A network model, as applied in both LQG and D-Net, is also capable of explaining and predicting other anomalous features of the universe such as chaos. It is important to note that although a chaotic system cannot evolve in the traditional sense- the system and environmental changes are too erratic and unpredictable for adaptation to occur in real time- such systems have been shown to exhibit underlying determinism, based on the phenomenon of strange attractors. Although somewhat speculative, these deterministic patterns underlying chaos and randomness may be more accessible via a theory of decision networks.
The universe can therefore be modelled as the outcome of a form of computational process, but one which can evolve in time as a consequence of the information flowing through it, guided by a network of decisions. Information input/output is therefore a form of story, the narrative of which is a flow of causal processes represented by flows of information but without an ending- just the continuous evolution of a self-organising cosmos with life, in all its manifestations, as the main actor.
The universe of events is above all a relational universe and the most important relationship is causality. Events may have more than one contributing cause and an event may contribute to more than one future event. The universe therefore has time built into it from the beginning, with time and causality being synonymous. Processes and decisions follow one another by causal necessity. One way to describe such a universe is by information states and it is therefore possible to create a network of information transfers between the decision points or nodes of those states.
Roger Penrose’s Twistor Theory also defines a network of causal relationships between spacetime events without needing to specify locations in space, by mapping geometric objects of space-time into geometric objects of a 4D complex space.
Evolution can therefore be modelled as a process of causal information flows or transfers that are self-organising, guided by a network of decisions as in a neural network, each neuron representing a decision node which fires and transfers information only when a particular decision threshold is reached. Evolution uses the decision network for extracting the most useful information in a particular context, aggregating and sifting events such as genetic mutations and transferring the result once a decision threshold has been reached.
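Such a decision node behaves like a classical threshold neuron: it aggregates weighted evidence and transmits only when the accumulated signal crosses its threshold. A minimal Python sketch, with purely illustrative weights and threshold:

def decision_node(inputs, weights, threshold):
    """Fire- i.e. transfer information- only when the weighted evidence crosses the threshold."""
    signal = sum(w * x for w, x in zip(weights, inputs))
    return 1 if signal >= threshold else 0

# three evidence channels (e.g. candidate mutations), illustrative weights and threshold
print(decision_node([1, 0, 1], weights=[0.5, 0.3, 0.4], threshold=0.8))  # 1: fires
print(decision_node([1, 0, 0], weights=[0.5, 0.3, 0.4], threshold=0.8))  # 0: stays silent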
The latest theories of quantum gravity, in particular LQG, have a number of concepts in common with D-Net, as mentioned previously. LQG defines space not as a background stage but as a network of relationships that connect events and processes in the universe, evolving in time. In this model, based on quantum angular momentum or spin networks, the universe is purely a dynamic network of relationships without a spatial coordinate framework. Spin networks therefore do not live in space- they generate space and quantum information. This is the same type of relational model on which general relativity is based, which is not surprising as LQG is a quantized version of GR.
D-Net is predicated on a similar network of relationships, mirroring the information flows driving evolution, in the same way that a social network selectively exchanges information between members.
In the LQG model, the topological braiding patterns of its spacetime nodes can encode quantum particles. In the D-Net model, braids encode the quantum logic operations driving its decision processes.
Evolutionary decision networks and LQG spin networks can therefore be defined within a common conceptual framework, both representing causal information flows with their nodes acting as mediators of the flow. Evolutionary networks operate at a more abstract level, but are still dependent on the relationship between the system and its environment. A spin network evolves over time, creating a more intricate entity- a spin foam. Similarly, a decision network evolves over time, modelling the evolution of a system and creating the analogue of a spin foam- a ‘decision foam’.
In LQG, each observer has only limited and partial information about the universe, each with a different view that needs to be reconciled with all other observers’ perspectives. In the D-Net model each system’s evolution likewise has to be reconciled with all others, to build a composite view of the universe’s evolution; each system must interrelate with all others. It is also a consistent theory because the different evolutionary states of the system are correlated. The decision outcomes of each system are correlated with those of all other systems, as are the observers’ partial information frameworks in LQG. Context dependence is also central to the mathematical formulation of the theory- the context of the history of the observer. It resolves the paradox of quantum superpositions by making it a consequence of one’s point of view, or of decision states related to the Wheeler-DeWitt equation- a superposition of evolving entities.
It can also be shown that superposition-based quantum networks can be based on the classical perceptron model of multilayered, feedforward neural networks and the algebraic model of evolving quantum structures as described in quantum gravity. (ref-) The main feature of this model is its evolution from particular neural topologies to a quantum metastructure which embodies many differing topological patterns. Using quantum parallelism, evolutionary training is possible on superpositions of different network topologies. As a result, not only classical transition functions, but also the topology of the quantum metastructure becomes a subject of evolutionary training.
This suggests that decision networks also can be considered as candidates for this model in the same way- as topologically evolving quantum networks.
D-Net also helps resolve the conundrum of Everett’s Many Worlds Interpretation. In the LQG model the universe gradually unfolds and is continually presented with alternative pathways of development, or splits in histories. In the Many Worlds Interpretation, instead of the universe splitting each time a decision is made by the observer, the quantum decision processes in the brain create new divergent pathways through a different sequence or network of events in a Hilbert quantum space, creating new observer-dependent worlds but not new physical universes. By adding decision processes to the mix, D-Net can help explain how each divergent path is selected.
The decision processes of D-Net can therefore be linked to the quantum processes of spacetime in a quantized theory such as LQG. The uncertainty of information flowing through the network is consistent with Heisenberg’s Uncertainty Principle and lies at the heart of Frieden’s information action principle. In addition, D-Net helps solve the problem of future time travellers returning to the past: the travellers would reach their destination but then travel through different spin or decision networks, avoiding the risk of tangling with present states.
To understand complex systems such as people or cultures we need to know not just their causal histories, as in LQG, but also their evolutionary histories: why they took the evolutionary paths they did. Physical events alone, and the information they generate, do not fully encompass the complexity of life. Complex entities like people and cultures are processes unfolding in time, and life’s emergence can only be fully understood by tracing their evolutionary decision pathways- the network connections between events.
D-Net reflects the central role of the evolving system, creating pathways through evolutionary events in spacetime. It is therefore consistent with LQG in terms of combining information from a vast number of decisions over evolutionary time, reflecting the context and history of the system- its evolutionary options and choices.
In the evolutionary model, the system is not only an integral part of the evolving universe- it eventually subsumes it.
In LQG a set of questions can be specified about the history of the universe and the probabilities of the different answers computed. The questions or decision options in a sense bring the reality into existence. This is precisely the premise of Frieden’s theory and also D-Net. According to Smolin it is possible to construct a pluralistic version of quantum cosmology in which there is one universe but many histories or mathematical versions of what an observer can see. It is consistent only when two observers ask the same question and agree on the answer. In LQG, there exists one world seen as a jigsaw or overarching composite structure by many different observers; not multiple worlds as seen by one final observer from outside it. The universe in this model therefore contains many different consistent histories, each of which can be brought into existence by asking the right set of questions and measuring the information outcomes. This could describe an infinite set of quantum worlds, each of which corresponds to part of the world seen by a particular observer at a particular place and time in the history of the universe.
Asking the right questions, however, presupposes making the right decisions about what questions to ask. This corresponds to the role of the observer as postulated in D-Net. Life, as the set of multiple observers, expands and evolves to become co-existent with the universe- an Omega or Universal Consciousness. D-Net therefore ensures that the right decisions are made, so that the universe maintains its integrity and life realises its full potential. As life and the universe merge into one co-existent sentient entity, the universe is enabled to reach its full potential.
Wednesday, May 21, 2008
Untangling Time- Back to the Future
The equations of physics are generally time-symmetric, making no functional distinction between future and past. For example, the Wheeler-Feynman formulation of electrodynamics is based on waves travelling both backwards and forwards in time.
Several experiments have been devised to test the notion of reverse or retro-causality, including a variation of the classic quantum mechanical wave/particle experiment in which measurement is delayed until after the photons have passed through the double slit. However this is not deemed sufficiently rigorous as a proof, since the path a photon took before the measurement could not be verified.
Physicist John Cramer has recently proposed a more rigorous test using the ‘transactional interpretation’ of quantum mechanics, which proposes that particles interact by sending and receiving physical waves that travel backwards and forwards through time. He has devised an experiment based on the entanglement of photons and their properties such as momentum, which then share the same wave or particle behaviour (7)(8).
Pairs of photons from a laser beam are entangled before being sent along two different paths. One stream is delayed by several microseconds by sending it through a 10 km optical fibre cable. A moveable detector can then be used to sense a photon in two positions, as either a wave or a particle.
Choosing to measure the delayed photons as waves or particles forces the photons in the other beam to be detected the same way, because they are entangled. This choice therefore influences the measurement of the entangled photon even though it is made earlier.
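Standard quantum mechanics already fixes the correlation this proposal exploits. The Python sketch below prepares a two-photon Bell state and shows that, whatever common measurement basis is chosen, the outcomes for the two photons always agree; it illustrates the entanglement Cramer’s experiment relies on, not the retro-causal mechanism itself.

import numpy as np

# the Bell state (|00> + |11>)/sqrt(2): a maximally entangled photon pair
psi = np.array([1, 0, 0, 1]) / np.sqrt(2)

def meas_basis(theta):
    """A measurement basis rotated by angle theta (rows are the basis states)."""
    return np.array([[np.cos(theta), np.sin(theta)],
                     [-np.sin(theta), np.cos(theta)]])

for theta in (0.0, 0.4, 1.1):
    U = np.kron(meas_basis(theta), meas_basis(theta))  # same basis choice for both photons
    probs = np.abs(U @ psi) ** 2  # outcome probabilities for 00, 01, 10, 11
    print(f"theta = {theta:.1f}  P(00,01,10,11) = {np.round(probs, 3)}")
# P(01) and P(10) are always ~0: whatever basis is chosen, the two outcomes agree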
If retro-causality is proved it could solve the major enigma of quantum entanglement and non-locality, already defined within the context of the D-Net model, but at the same time it would require a complete reformulation of the laws of causality.
Measuring one entangled particle could send a wave backwards through time to the moment at which the pair was created, without being limited by the speed of light. Retro-causality might also help explain why we live in a universe so finely tuned to the existence of life, as Paul Davies has speculated. The universe may be able to retrofit its parameters using reverse causality to ensure the emergence of life as we know it.
D-Net provides a possible mechanism to achieve such a reverse engineering process, by allowing retracing of the pathways encoded as evolutionary histories. The parameters of the big bang could then be adjusted by a reverse evolutionary process to provide a way of accelerating the selection of a form of life capable of evolving into a more efficient information processor. The presence of intelligent observers later in history could therefore exert an influence on the big bang to reverse engineer the most appropriate conditions for the emergence of an optimally efficient form of life.
The D-Net model would allow this possibility, because a causal network would allow either reverse or forward pathways to be established as the most efficient history.
Biologically it has already been shown that life can be reverse engineered, for example to produce chickens with teeth and snakes with limbs, by reversing evolutionary pathways.
In addition, the latest interpretation of the Anthropic Principle suggests a much broader range of physical settings governing the creation of a universe that could lead to the evolution of life as we know it (9). For example, an alternative pathway has recently been formulated allowing life to evolve without the existence of the weak force. It may be possible to define many other evolutionary pathways that could result in an alternative intelligent life form capable of efficiently processing information.
Friday, May 16, 2008
The Unitary Quantum Connection
Computing is a process of mapping from inputs to outputs. A quantum computer performs a physical mapping from initial quantum states to final classical states. This solution mapping process is called the ‘unitary evolution’ (3) of the states of the system.
Solving a problem also represents a mapping from an input information parameter set to an output solution set. NP problems resist solution under the constraints of serial computing because, as the input parameter space expands, the number of possible mapping pathways to an output solution expands exponentially. Such problems can be attacked more efficiently by parallel quantum computing, whose power increases exponentially at a rate comparable to the exponential increase in the solution-set pathways or unitary evolution of the system. This can be modelled as a ‘sum over histories’ or path integral formulation of quantum theory.
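A toy Python version of this sum over histories, under invented assumptions: enumerate every path through a small layered network, assign each an amplitude exp(iS) from a made-up action S (here a sum of node ‘costs’), and add the amplitudes. Histories whose actions vary rapidly largely cancel, leaving the efficient, near-stationary paths to dominate- the same cancellation of improbable pathways invoked below.

import cmath
from itertools import product

# a toy layered decision network: 3 layers, 3 nodes per layer, with invented node 'costs'
costs = [[0.2, 1.0, 2.5],
         [0.3, 1.1, 2.4],
         [0.1, 0.9, 2.6]]

total = 0
for path in product(range(3), repeat=3):  # every possible input-to-output history
    S = sum(costs[layer][node] for layer, node in enumerate(path))  # a toy 'action'
    total += cmath.exp(1j * S)  # the amplitude contributed by this history

print(abs(total) ** 2)  # paths with rapidly varying action largely cancel in this sum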
The D-Net model is also supported by the Unitary Quantum Transform process as follows-
As with computing and problem-solving processes, evolution too may be thought of as a mapping from the system’s present information state to the target state of the environment. The Action in this case is the integral of the decision paths over an ‘evolutionary field’, and minimising this integral- the principle of Least Action- is the mathematical process that, in the author’s theory, drives evolution. The D-Net evolutionary decision network provides a model of this mapping mechanism. The model predicts many different decision paths in the configuration space from input to output, but only a small proportion- the most efficient- are relevant to the system. The other paths are those with the smallest probability of success and can be cancelled out.
Each transition path, including the decision nodes embedded in it, creates a decision history. The sum of these decision histories provides the evolutionary mapping function and may be integrated to provide the quantum decision integral, or sum over histories, for the evolutionary process. Life therefore is an extremely efficient information processor because it incorporates adaptive learning decision pathways that, over time, maximise the efficiency of the information differential reduction process within a unitary evolution framework.
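The following toy sketch, with purely hypothetical node costs, illustrates the idea of a sum over decision histories: every path through a small three-node decision network is enumerated, and a Boltzmann-style weighting shows how the cheapest, most efficient histories dominate the total while improbable paths contribute almost nothing.

```python
# Toy 'sum over decision histories' (hypothetical costs): enumerate every
# decision path through a three-node network, weight each history, and
# observe that the most efficient paths dominate the total.
import math
from itertools import product

costs = [                      # per-node cost of each decision option
    {'a': 1.0, 'b': 3.0},
    {'a': 0.5, 'b': 2.5},
    {'a': 1.5, 'b': 2.0},
]

histories = []
for path in product(*(c.keys() for c in costs)):
    total = sum(costs[i][choice] for i, choice in enumerate(path))
    histories.append((path, total, math.exp(-total)))   # cheaper => heavier weight

Z = sum(w for _, _, w in histories)                     # the 'sum over histories'
for path, total, w in sorted(histories, key=lambda h: h[1]):
    print(path, f"cost={total:.1f}", f"relative weight={w / Z:.3f}")
```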
In a sense the decision paths become shorter and more efficient, enabling the optimal evolutionary outcome or unitary transform to be achieved at the lowest energy cost, just as the geodesic in Riemannian geometry represents the shortest or most efficient path between two points. There therefore appears to be a strong connection between Riemannian geometric functions and evolutionary decision functions, which allows the formulation of a metric to calculate the efficiency of the evolutionary process. In effect the minimal geodesic distance between the information state of the observing system and the target information state of the system’s environment, applying a unitary operator, is equivalent to the minimal number of ‘decision gates’ or nodes and the number of connecting links or steps required to achieve the appropriate adaptation.
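A minimal sketch of this geodesic analogy follows, using Dijkstra’s shortest-path algorithm over a hypothetical decision graph; here the ‘geodesic’ is simply the cheapest chain of decision nodes from the system’s current information state to the environment’s target state.

```python
# Sketch of the geodesic analogy: the most efficient adaptation corresponds
# to the shortest (cheapest) chain of decision nodes from the system's
# current state to the target state. All node names and costs are hypothetical.
import heapq

graph = {  # node -> [(neighbour, step cost)]
    'current': [('n1', 1.0), ('n2', 2.5)],
    'n1': [('n3', 1.0), ('target', 4.0)],
    'n2': [('target', 1.0)],
    'n3': [('target', 1.0)],
    'target': [],
}

def geodesic(graph, start, goal):
    """Dijkstra shortest path: the 'minimal geodesic distance' in decision space."""
    queue, seen = [(0.0, start, [start])], set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, cost in graph[node]:
            heapq.heappush(queue, (dist + cost, nxt, path + [nxt]))

print(geodesic(graph, 'current', 'target'))  # (3.0, ['current', 'n1', 'n3', 'target'])
```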
In this process the complexity of the system is increased through the acquisition of additional information, together with greater flexibility and efficiency in resolving the differential, which can then be applied in adapting to more complex future environmental challenges. Such a metric would therefore be a function of the density of decision operations or transformations required to implement the unitary evolutionary transform, plus a measure of the efficiency of the information feedback mechanism that keeps the evolutionary process on track. In effect, the evolution of the ‘decision matrix’ applied to minimise the information differential allows the evolution of the system as a whole to be achieved. This in turn could be calculated from Frieden’s Theory combined with existing Quantum Information, Unitary and Network Theory.
The Quantum Information Connection
At the physical level, the latest theories place information at the heart of the process driving the evolution of the universe. Loop Quantum Gravity (LQG) describes the evolution of the cosmos as flows of information through networked channels of spacetime.
It is understood that the laws which govern emergent phenomena such as life, consciousness and advanced intelligence are based on processes of self-organisation, information and complexity which cannot be logically derived from the underlying laws of physical forces and matter. According to the author’s generic evolutionary theory, they may however be derivable from the laws of information theory, such as Frieden’s Information Action Law (2). Frieden’s Law is based on Fisher Information ‘I’, which measures how much information can be extracted from a physical system given all the errors or uncertainties relating to its quantum and statistical state. This principle builds on the well known idea that the observation of a ‘source’ phenomenon is never completely accurate. Information is inevitably lost in the transition from source to observation, and random errors may further degrade the observation.
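As a concrete, standard illustration of Fisher information (not drawn from Frieden’s work itself), the sketch below estimates ‘I’ numerically for a Gaussian source observed with noise, where the analytic answer is known to be 1/sigma squared.

```python
# Sketch of Fisher information 'I' for a simple source: a Gaussian signal of
# known variance observed with noise. I is estimated as the variance of the
# score, d/dtheta of the log-likelihood; the analytic result is 1/sigma^2.
import numpy as np

rng = np.random.default_rng(0)
theta, sigma, n = 5.0, 2.0, 200_000
samples = rng.normal(theta, sigma, n)        # noisy observations of the source

score = (samples - theta) / sigma**2         # d/dtheta of log N(x; theta, sigma)
I_hat = np.var(score)                        # Fisher information estimate

print(f"estimated I = {I_hat:.4f}, analytic 1/sigma^2 = {1 / sigma**2:.4f}")
```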
It is known that physical laws can be derived from Lagrangian mathematical functions in the form of differential equations. A Lagrangian measures the difference between two quantities defining a system, such as its kinetic and potential energy; the integral of the Lagrangian over time is called the ‘action’. Minimising the action generates the differential equations and laws governing the evolution of the system. The significance of the action in formulating the laws of physics is well understood by physicists, but the underlying principle behind it is not.
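For reference, the standard form of this machinery can be written as follows; this is textbook material rather than anything specific to the author’s theory.

```latex
% Classical Lagrangian mechanics: the action S is the time integral of the
% Lagrangian L = T - V (kinetic minus potential energy), and demanding that
% S be stationary (Least Action) yields the Euler-Lagrange equations of motion.
S[q] = \int_{t_1}^{t_2} L\left(q, \dot{q}, t\right)\, dt,
\qquad \delta S = 0
\;\Longrightarrow\;
\frac{d}{dt}\,\frac{\partial L}{\partial \dot{q}} - \frac{\partial L}{\partial q} = 0 .
```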
Frieden suggests it relates to the information bound up in the system. The Lagrangian equations for Frieden’s Law are generated from the difference between ‘I’ the Fisher information or how much information can be extracted from the system and ‘J’ the amount of information needed to provide the best possible description of the phenomena. The appropriate Lagrangian function ‘L’ can then be derived from the difference between these two information measures, which when minimised produces the laws governing the system, in the form of differential equations. Frieden’s theory is therefore based on an efficient variational principle which can be used to derive most of the fundamental laws of physics, as well as laws of biology, chemistry and economics. It therefore offers, in the author’s interpretation of Frieden’s formulation, a bridge for deriving the laws of nature in general and evolution in particular.
The major physical laws relating to relativity, electromagnetism and quantum mechanics have all been derived using this method. For example the central law of quantum theory, which describes the way particles move through time and space, has been derived directly from Frieden’s Least Action information principles. Similarly Einstein’s theory of general relativity can be derived from a Lagrangian approach, as was first shown by the great mathematician David Hilbert. Therefore it is proposed that the laws and equations governing evolution as a physical process can also be derived from Frieden’s theory, broadly as follows.
Let the maximum information capable of being extracted from the system’s environment be represented by a function of Fisher information, f(I).
Let the information potential of the observer system to match this information be represented by a function of Frieden’s ‘J’ information measure, f(J).
Frieden’s Information Action is then the integral of the Lagrange function of the normalised difference between these two measures, f(I) - f(J).
The capacity of the system to minimise this action, that is, to achieve a least-action information differential between these two statistical parameters as proposed in the author’s D-Net model, provides the required Lagrangian function ‘L’ for the process of evolution.
Then, by integrating over the decision paths comprising the configuration space of the evolutionary field defined over (I - J), the Action for the evolutionary process can be defined as Ev = ∫ L(f(I) - f(J)), which is analogous to the path integral formulation of quantum field theory.
Finding the minimum Ev over the decision or configuration space (I - J) therefore provides the dynamical time equations governing the evolutionary trajectory of the system, and is why the ‘information action’ is so important. It provides the underlying basis for realising the process of evolution in our universe.
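A toy numerical sketch of this minimisation, with entirely hypothetical information series, is given below: the environment’s extractable information I is a short time series, the system adjusts its bound information J step by step, and an exhaustive search returns the decision path of least total ‘information action’.

```python
# Sketch of minimising a discretised 'information action': Ev is the sum of
# (I - J)^2 over the steps of a decision path. The environment series I_env
# and the allowed per-step moves in J are hypothetical values.
I_env = [4.0, 4.5, 5.0, 5.2]          # information offered by the environment
moves = [-0.5, 0.0, 0.5, 1.0]         # allowed per-step changes in the system's J

def least_action(J0):
    """Exhaustive minimisation of Ev = sum (I - J)^2 over all decision paths."""
    best = (float('inf'), [])
    def walk(step, J, action, path):
        nonlocal best
        if step == len(I_env):
            if action < best[0]:
                best = (action, path)
            return
        for dJ in moves:
            Jn = J + dJ
            walk(step + 1, Jn, action + (I_env[step] - Jn) ** 2, path + [round(Jn, 2)])
    walk(0, J0, 0.0, [])
    return best

action, path = least_action(J0=3.5)
print(f"minimal action Ev = {action:.2f} along J path {path}")
```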
It is recognised that organising principles in nature emerge at successive levels of complexity and that the universe as a whole possesses a tendency to develop towards progressively higher levels of complex organisation. The thesis outlined in this book supports this hypothesis: that evolution is the primary organising principle leading to more complex organisation in the universe. The universe requires life as an efficient information processor to achieve this state of complex organisation. Within the evolutionary process, the observer is Life, defining its reality in relation to its environment. This reiterates a key premise of quantum theory: that the laws of physics are defined through the act of observation, with reality created by the observer’s participation. The universe therefore realises its existence and potential in terms of life’s evolution.
Frieden’s Information Law is the second major evidential basis supporting the author’s unified evolutionary theory.
In addition it should allow the derivation of the precise mathematical formulation, from differential equations, supporting a decision-based model of evolution such as D-Net. This has been implemented within the Unitary Quantum Evolution (3) and Cosmic Spacetime Network frameworks defined below.
Thursday, May 15, 2008
The Biological Connection
All living systems apply information processing whether at the cellular, genetic or neural level to realise their potential through the evolutionary process.
Evidence for this principle at the fundamental level of the bacterium has recently been published in Science magazine, supporting the author’s theory from a biological perspective (1). This is the first evidential basis offered.
According to this research, bacteria have the ability to keep track of the level of information uncertainty and entropy in their environment with the option of reacting to minimise it through genetic operators such as mutations, in synchronicity with the rate of change of the level of uncertainty.
Higher-level organisms have long since adopted similarly sophisticated adaptive methods, including DNA/RNA transcription editing, RNA interference and the use of epigenetic factors such as methyl groups, to react quickly to changes in the environment. But even simple organisms such as bacteria and viruses are surprisingly responsive and capable of monitoring and reacting to their environment. Random mutation together with genetic recombination is now disputed as the only evolutionary mechanism available; such a simple basis would have taken far too long to achieve the improvements in flexibility needed to ensure ongoing survival. It is now understood that there are more direct feedback loops available to life at all levels, providing information on external change and maximising opportunities to fast forward to better solutions within environmentally manageable time frames.
At Princeton University in 2007, physicist William Bialek’s team provided further evidence of the critical link between life’s evolution and its capacity to access and interpret the information in its external environment, using the single-cell bacterium E. coli as a test model (10).
Each bacterium uses lac proteins to break down its food, the sugar lactose. Knowing how much lac protein it should produce to maximise its use of the available sugar in its environment provides it with a competitive edge over rival cells.
This relationship between the capacity of E. coli to produce the optimum quantity of lac protein and the amount of information available to its gene regulatory network has now been mathematically determined as follows-
One bit of information available to the bacterium about the required level of lac protein production, which governs the cell’s capacity to turn lac protein expression on or off, confers a 5 percent ‘fitness advantage’ over bacteria storing no bits.
This suggests, according to Bialek, that there is a minimum information threshold for life’s survival, and that natural selection favours organisms that capture more information about their environment.
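The sketch below illustrates the underlying quantity, mutual information measured in bits, for a hypothetical informed versus uninformed cell; the probability tables are illustrative assumptions, not the paper’s data.

```python
# Sketch of the information/fitness link: a cell whose lac expression tracks
# the environment holds mutual information about it; an uninformed cell holds
# none. The joint distributions below are illustrative assumptions.
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint distribution {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Environment: lactose high/low with equal chance.
informed   = {('high', 'on'): 0.5, ('low', 'off'): 0.5}      # expression tracks environment
uninformed = {('high', 'on'): 0.25, ('high', 'off'): 0.25,
              ('low', 'on'): 0.25, ('low', 'off'): 0.25}     # expression random

print(f"informed cell:   {mutual_information(informed):.2f} bits")    # 1.00
print(f"uninformed cell: {mutual_information(uninformed):.2f} bits")  # 0.00
```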
It also provides direct evidence supporting the thesis of this book: that the process of evolution is governed by the system’s level of access to relevant information in its environment.
Life has already learned that cooperation can greatly leverage the information generation and sifting process. Multi-cellular life, ant colonies and societies of all species apply this principle, leading inevitably, as a number of eminent philosophers and physicists such as Frank Tipler have postulated, to the emergence of a super cooperative entity with the attributes of a global consciousness- defined in this book as ‘Omega’. Omega is marked by instant communication between its intelligent sub-units or information processor life forms, acting in concert, continuously exchanging and amplifying knowledge. Such an Omega entity will therefore continue to evolve towards the structure of an infinitely complex, multidimensional and networked system. This is not a purely philosophical concept, but an inevitable consequence of the new unified evolutionary theory.
Modern humans, as well as all other species to varying degrees, have acquired the capacity to react, modify behaviour and search for requisite sources of information and knowledge to assist in adapting to a fast-changing social as well as physical environment. Humans are particularly adept at adapting to cultural change, with the current generation’s rapid uptake of cyber-age, globalised lifestyles the most pervasive example. The evolutionary process therefore not only guarantees efficient information processing but continually selects, on a reward basis, the techniques and protocols that result in more efficient information processing outcomes. This extends the potential of life in the universe, and of the universe itself, as life becomes coexistent with it.
Cooperation also amplifies knowledge by allowing better sharing and generation of diverse knowledge sources by life, resulting in greater complexity, with the capability of withstanding harsher shocks from the external environment. As in any cellular process, information is exchanged between component populations via networked channels. It should also be noted that the brain has evolved information acquisition mechanisms that reward life for learning about its environment, based on the release of opioid neurotransmitters when new ideas are absorbed. This confers an evolutionary advantage and ensures that information processing continues unabated for all life.
Tuesday, May 13, 2008
Self-Organising System Characteristics
As defined in the previous post, open systems are driven from equilibrium by the impact of an evolving environment producing stress, which eventually causes the system to morph into one or several new configurations: for example, a bird species splitting into two separate species in response to climate change, or a string energy landscape tunnelling into a lower-energy topology at the Planck level in response to quantum fluctuations.
In string theory this may have the effect of releasing sufficient energy to create a new bubble universe. This bifurcation or splitting into new information states produces new configurations which match the changing environment more closely. In classical Darwinian terms, the environment has selected the new state(s) as a better match over the original architecture. Stress on a system creates the need for an injection of new information, which allows increased complexity to be achieved. The need to survive or realise greater potential drives the accrual of new information, causing the system to seek new options.
Once the system splits it reaches a new equilibrium state; in effect it has self-organised its network configuration into a lower information and energy differential. Again the environment diverges, and so the process repeats ad infinitum. The system will also increase the thermodynamic entropy of its environment by exporting its waste energy, evading degeneration by importing information and exporting entropy.
Self-organising systems, organic or inorganic, therefore try to reach equilibrium with their environments, whether in the form of life, ecosystems, galaxies, molecular chemical solutions or black holes. The form of equilibrium reached may be in relation to any energy form- thermal, kinetic, electromagnetic or nuclear. All can ultimately be associated with and measured in terms of information states.
In each case the process is the same- the system minimises its energy or information differential to achieve equilibrium.
Therefore the process of self-organisation is equivalent to evolution.
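The loop described above can be caricatured in a few lines of code; the drift and adaptation rates below are illustrative assumptions only.

```python
# Sketch of the self-organisation loop: an open system repeatedly measures
# its differential against a drifting environment and adjusts its state to
# minimise it, never quite reaching equilibrium. Rates are hypothetical.
import random

random.seed(1)
env, system = 10.0, 0.0
for step in range(8):
    env += random.uniform(-0.5, 1.5)        # environment keeps diverging
    differential = env - system             # information/energy differential
    system += 0.6 * differential            # adaptation: partial minimisation
    print(f"step {step}: env={env:5.2f} system={system:5.2f} "
          f"differential={env - system:5.2f}")
```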
The inevitability of the variation in the constants and laws of nature can also be better understood within the D-Net framework as follows-
All parameters of the universe, including the so-called constants of nature- the masses and charges of elementary particles as well as the speed of light and Planck’s constant- are emergent properties of the constantly evolving braided networks of spacetime, as they accommodate themselves within the larger structure of the multiverse. This is the same process of self-organisation or adaptation that drives all other physical phenomena and processes.
As our particular universe self-organises to minimise its energy and information differential in relation to the larger multiverse environment, these parameters or constants continuously vary, as do the relationships between them- the Laws of Nature. This never-ending evolutionary process in turn alters the environment of the larger universe or system, which in turn alters the environment of each system within it, and so the evolutionary process continues ad infinitum.
Life as both observer and efficient information processor converts entropy or unknown information states to known information states, reducing entropy within the interactive observer system. Therefore the universe can both increase its organisational complexity and at the same time maintain creative unidirectional progress in the face of the second law of thermodynamics. This is life’s crucial significance within the cosmos.
A system such as life is enjoined with all ecosystems within its environment, including not only the biodiversity and biosphere of planet earth but the cosmos at large. Its survival is therefore intimately linked to the survival of the universe.
The validity of the D-Net model depends on its deep connection to both the latest biological and physical theories. These are now discussed.
Sunday, May 11, 2008
Open System Characteristics
Open system frameworks are essential to the evolution of life because they allow for the flow of energy and information between the system and its environment. Closed systems wither and die.
Open systems are dynamic and constantly changing, but can never reach equilibrium, both because of the inherent unpredictability and uncertainty of their environments, which are in turn open systems, and because all systems undergo unpredictable, random fluctuations in accordance with quantum theory. This uncertainty continually creates an information differential, which the system must monitor and attempt to minimise by acquiring additional information and processing capacity, including learning, computational and structural modification. The further an open system is from equilibrium within its environment, the more pressure it will be under to adapt and evolve to a new state of greater complexity.
A recent discovery in biology reveals that even bacteria monitor their environment and adapt directly to uncertainty through the process of ‘bet hedging’. This allows an organism to shift to a new configuration of its cellular networks and protein expression through nucleotide mutation and editing. The result is greater complexity of form, function and survivability, because the new information selected is based on its potential to create a larger and more efficient solution space for the bacterium.
The cybernetic law of requisite variety also requires that systems acquire sufficient information to match the information capacity of their environment. Therefore in order to survive a system must be coupled to its environment and that environment in turn is nested within a larger system in the manner of ‘Russian Dolls’. But as each system seeks to adapt and transform, it also triggers changes within its immediate environment, which ripple out to wider systems and environments. And so the process continues, with entropy increasing in the wider universe, but at the same time creating more information and greater complexity for life. This extends the life of each living system and therefore of life as a whole. Although life appears doomed in a particular universe when insufficient energy is available to generate the requisite information to negate entropy, according to the eminent physicist Frank Tipler, by processing an infinite amount of information in subjective time, life may extend its survival indefinitely.
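Ashby’s law of requisite variety can be stated quantitatively: a regulator can reduce the entropy of outcomes by at most its own variety (entropy). The sketch below uses hypothetical state counts.

```python
# Sketch of the law of requisite variety: the residual entropy of outcomes is
# bounded below by H(disturbance) - H(regulator), so a system needs variety
# matching its environment's. The state counts are illustrative assumptions.
import math

def entropy(n_states):
    """Variety measured in bits for n equally likely states."""
    return math.log2(n_states)

H_disturbance = entropy(16)   # environment can disturb in 16 distinct ways
H_regulator   = entropy(4)    # system can respond in only 4 distinct ways

H_outcome_min = H_disturbance - H_regulator
print(f"residual outcome entropy >= {H_outcome_min:.1f} bits")  # 2.0 bits unabsorbed
```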
All open systems are therefore connected with and exert influence on their external environments through boundary and network enmeshments. These networks are the equivalent of Loop Quantum Gravity spacetime networks at the Planck level. LQG networks encode their links as volumes and areas of space allowing information to pass from observer to observer, each with a partial view of reality. The network encodes processes by which information is conveyed from one part of the universe to another. But it is important to note that although LQG networks are modelled independently of any preferred set of spacetime coordinates, this level of abstraction can be extended further through decision networks. The information channels and decision nodes of the D-Net model are completely independent of any notion of physical space. Information is selected by the evolutionary network on the basis of its information value to the system in terms of its relationship to other systems.
Monday, May 5, 2008
A New Evolutionary Model- Higher Principles
One of the central tasks of physics is the identification of higher organising principles in nature. These manifest as emergent laws and phenomena that have triggered major shifts in the evolution of life and the universe since its genesis almost 14 billion years ago. An overarching Universal Principle is required to integrate and unify the many disparate theories, models and laws upon which our scientific and cultural paradigms are based.
It is postulated that evolution is possibly the paramount such principle, defined at a much broader and deeper level than previously realised. It provides not only a framework for biological processes based on the Darwinian model, but a major overarching paradigm rationalising all processes and phenomena comprising the universe at large.
The author’s hypothesis proposes that life, the universe and everything is driven at the core by a far deeper, more complex and enigmatic manifestation of the evolutionary process based on the principles that follow.
It is further proposed that these principles form a comprehensive basis for a formal proof of a complete information based unified theory of evolution. The theory defined in this project is the Decision-Network Evolutionary Theory or D-Net.
Principle 1- Evolution is an adaptive decision/information-based convergent process that occurs at all levels and for all forms of process complexity- physical, biological and social. Adaptation is a common language for both living and non-living systems.
At the core, it involves reducing the information differential between a system and its environment.
Principle 2- Life as we know it is an outcome of the evolutionary process because it continually reinforces the capacity for more complex information processing.
As information is continually generated by life and its support systems, it inevitably leads to more complex life and therefore more efficient information processing systems. Life therefore, as an outcome of this complexity gain, represents a continually evolving class of efficient information processors.
Principle 3- Evolution is an adaptive process that seeks to minimise the functional differential between a system and its environment by increasing its information processing capacity. This hypothesis is supported by the recently discovered genetic process of ‘bet hedging’ in bacteria, as well as adaptive learning processes in all living systems. In addition, the Principle of Least Action of physics, as adapted by Roy Frieden to extract the laws of nature from information measurement, provides a powerful supporting mathematical framework for this theory.
Principle 4- Evolutionary based system transformations can be expressed in terms of information, network and decision theory and can be modelled by causal information/decision networks. In this model the network’s edges represent channels for information flows and the network’s nodes the decision processes for selecting the most efficient information transformations governing system/environment interactions. Additionally, decision states or outcomes from each node may be modelled as quantum states in a Hilbert space (a minimal sketch of such a network follows Principle 9 below).
Principle 5- Decision/network processes are an integral part of evolution, involving the selection of information that can manage the adaptation or information differential minimisation process. This in turn leads to the accrual of more complex decision capacity and value-based outcomes for the system. In other words, it represents a positive feedback loop that is increasingly beneficial to the system over time. It facilitates selection of the most appropriate information via decision probability outcomes. Information selection and injection using decision networks therefore reduces entropy within the system and increases complexity. In turn this increases the capacity for survival of life by eventually allowing an increasing amount of information to be processed in subjective time.
Principle 6- The decision network theory is based on a highly abstract model of quantum information processing, including the laws governing matter and forces. It therefore has much in common with the latest quantum gravity model of spacetime- Loop Quantum Gravity. As described in the previous chapter, this model maps spacetime and physical processes via the medium of networks representing quantised volumes, areas and, in the latest model, qubits. Fundamental particles such as quarks and electrons are not primary in the LQG model but secondary outcomes of causal information flows generated and guided by such braided networks of quantised spacetime.
Principle 7- Evolution is therefore the basis of adaptation and increase in complexity, which is achieved through a process of directed information and energy flows managed by the system’s self-organising processes. Self-organization acts by continuously transforming the system’s structure and topology, minimising energy and resource usage to create the most efficient and effective evolutionary outcomes.
The system seeks to maximise its information processing and energy efficiency and minimise its energy costs in relation to its environment. This is represented in physical systems by the Principle of Least Action, as previously outlined. However, unlike competing theories of evolution, energy processing is secondary to the key requirement of optimal information management.
Principle 8- Applied to living systems, evolution therefore results in an increase in organisational and computational complexity over time, which then results in an increase in the system’s environmental complexity through a process of reciprocal adaptation. Complex systems tend to be more stable as well as more adaptive than simpler ones. This provides improved selective advantage and ensures the system transforms over time to one of greater complexity.
Principle 9- The inevitable outcome of this positive reinforcement of life’s complexity and its optimisation as a sophisticated knowledge processor is the emergence of a seamlessly cooperative network or entity that eventually takes on the more abstract form of a global/universal consciousness- Omega, as described in the final chapter.
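As promised under Principle 4, here is a minimal sketch of a causal decision network of the kind described: edges carry information flows, nodes select the most efficient incoming transformation, and everything beyond the bare structure (node names, weights) is hypothetical. Decision outcomes could further be encoded as state vectors in a Hilbert space; they are kept classical here for brevity.

```python
# Minimal sketch of a causal information/decision network (Principle 4):
# edges are channels for information flows, nodes are decision processes
# selecting the most efficient transformation. All values are hypothetical.

# edges: (source, sink) -> channel weight (information flow capacity)
edges = {('env', 'd1'): 1.0, ('env', 'd2'): 0.7,
         ('d1', 'sys'): 0.9, ('d2', 'sys'): 0.4}

def best_channel(node, candidates):
    """Decision node: select the incoming channel with the highest flow."""
    return max(candidates, key=lambda src: edges[(src, node)])

# Information propagates env -> decision nodes -> system; the decision node
# keeps only the most efficient channel, pruning the rest.
chosen = best_channel('sys', ['d1', 'd2'])
flow = edges[('env', chosen)] * edges[(chosen, 'sys')]
print(f"selected path env->{chosen}->sys with flow {flow:.2f}")
```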
Life’s capacity to continually expand and improve its abilities as an extremely efficient processor of information, using the evolutionary process, is now testable as a scientific hypothesis. Since life emerged on this planet almost 4 billion years ago it has continued to increase its processing capability through the development of more flexible and adaptable cellular and neural structures, as well as adaptive body forms. In addition, all life from multi-cellular organisms to human societies has learnt the value of cooperation. Humans, the most advanced information processors on earth, now utilise a variety of sophisticated mathematical and computational techniques including artificial intelligence, cellular automata, quantum computing, neural networks, evolutionary algorithms and biological computing, as well as vastly amplifying their problem-solving capability through the internet’s massive computational intelligence.
This trajectory will continue at an exponential rate.
Systems, network, information and quantum theory therefore provide the theoretical basis for the unified evolutionary theory postulated in this book. Some of the most critical aspects are now discussed.
It is postulated that evolution is possibly the paramount principle as defined at a much broader and deeper level than previously realised. It provides not only a framework for the biological processes based on the Darwinian model, but a major overarching paradigm to rationalise all processes and phenomena comprising the universe at large.
The author’s hypothesis proposes that life, the universe and everything is driven at the core by a far deeper, more complex and enigmatic manifestation of the evolutionary process based on the principles that follow.
It is further proposed that these principles form a comprehensive basis for a formal proof of a complete information based unified theory of evolution. The theory defined in this project is the Decision-Network Evolutionary Theory or D-Net.
Principle 1- Evolution is an adaptive decision/information-based convergent process that occurs at all levels and for all forms of process complexity- physical, biological and social. Adaptation is a common language for both living and non-living systems.
At the core, it involves reducing the information differential between a system and its environment.
Principle 2- Life as we know it is an outcome of the evolutionary process because it continually reinforces the capacity for more complex information processing.
Higher Principles
One of the central tasks of physics is the identification of higher organising principles in nature. These manifest as emergent laws and phenomena that have triggered major shifts in the evolution of life and the universe since its genesis almost 14 billion years ago. An overarching Universal Principle is required to integrate and unify the many disparate theories, models and laws upon which our scientific and cultural paradigms are based.
It is postulated that evolution, defined at a much broader and deeper level than previously realised, may be that paramount principle. It provides not only a framework for biological processes based on the Darwinian model, but an overarching paradigm capable of rationalising all processes and phenomena comprising the universe at large.
The author's hypothesis is that life, the universe and everything are driven at their core by a far deeper, more complex and enigmatic manifestation of the evolutionary process, based on the principles that follow.
It is further proposed that these principles form a comprehensive basis for a formal proof of a complete information-based unified theory of evolution. The theory defined in this book is the Decision-Network Evolutionary Theory, or D-Net.
Principle 1- Evolution is an adaptive decision/information-based convergent process that occurs at all levels and for all forms of process complexity- physical, biological and social. Adaptation is a common language for both living and non-living systems.
At the core, it involves reducing the information differential between a system and its environment.
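As a minimal illustration of this idea (the book does not fix a specific measure, so treating the differential as the Kullback-Leibler divergence between a system's internal model and its environment's actual statistics is an assumption of this sketch):

    import math

    def information_differential(system, environment):
        # Kullback-Leibler divergence D(system || environment) in bits:
        # one plausible formalisation of the 'information differential'.
        return sum(p * math.log2(p / q)
                   for p, q in zip(system, environment) if p > 0)

    # A system whose internal model matches its environment's statistics
    # has zero differential; adaptation can be read as driving it to zero.
    environment    = [0.7, 0.2, 0.1]   # actual frequencies of three states
    naive_system   = [1/3, 1/3, 1/3]   # uninformed internal model
    adapted_system = [0.65, 0.25, 0.10]

    print(information_differential(naive_system, environment))    # ~0.47 bits
    print(information_differential(adapted_system, environment))  # ~0.01 bits

On this reading, the convergence of Principle 1 is simply the differential shrinking as the system's model comes to track its environment.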
Principle 2- Life as we know it is an outcome of the evolutionary process because it continually reinforces the capacity for more complex information processing.
As life and its support systems continually generate information, this inevitably leads to more complex life and therefore to more efficient information processing systems. Life, as an outcome of this complexity gain, represents a continually evolving class of efficient information processors.
Principle 3- Evolution is an adaptive process that seeks to minimise the functional differential between a system and its environment by increasing its information processing capacity. This hypothesis is supported by the recently discovered genetic process of 'bet hedging' in bacteria, as well as by adaptive learning processes in all living systems. In addition, the Principle of Least Action of physics, as adapted by Roy Frieden to derive the laws of nature from information measurement, provides a powerful supporting mathematical framework for this theory.
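Bet hedging itself is easy to see in a toy model (the growth figures below are illustrative assumptions, not measured values): a lineage that stochastically diversifies its phenotypes sacrifices growth in any one season but wins on long-run, geometric-mean growth in a fluctuating environment.

    import math
    import random

    # Growth factor of each phenotype in each environmental state.
    GROWTH = {("A", "suits_A"): 2.0, ("A", "suits_B"): 0.1,
              ("B", "suits_A"): 0.1, ("B", "suits_B"): 2.0}

    def long_run_growth(p_phenotype_A, generations=10_000, seed=1):
        # Geometric-mean growth of a lineage expressing phenotype A with
        # probability p_phenotype_A in a randomly flipping environment.
        rng, log_growth = random.Random(seed), 0.0
        for _ in range(generations):
            env = rng.choice(["suits_A", "suits_B"])
            yield_ = (p_phenotype_A * GROWTH[("A", env)]
                      + (1 - p_phenotype_A) * GROWTH[("B", env)])
            log_growth += math.log(yield_)
        return math.exp(log_growth / generations)

    print(long_run_growth(1.0))   # specialist: ~0.45, shrinks over time
    print(long_run_growth(0.5))   # hedger: 1.05, grows over time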
Principle 4- Evolutionary-based system transformations can be expressed in terms of information, network and decision theory, and can be modelled by causal information/decision networks. In this model the network's edges represent channels for information flows, and its nodes the decision processes that select the most efficient information transformations governing system/environment interactions. Additionally, the decision states or outcomes at each node may be modelled as quantum states in a Hilbert space.
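A minimal sketch of such a network follows. The class names, the softmax scoring of transformations and the reading of outcome probabilities as amplitudes of a unit vector are all assumptions of this illustration, not the book's formal definitions.

    import math

    class DecisionNode:
        def __init__(self, name, transformations):
            self.name = name
            self.transformations = transformations  # {label: efficiency score}
            self.edges = []                         # outgoing information channels

        def connect(self, other):
            self.edges.append(other)

        def outcome_distribution(self):
            # Softmax over efficiency scores: probability of each outcome.
            z = sum(math.exp(s) for s in self.transformations.values())
            return {t: math.exp(s) / z for t, s in self.transformations.items()}

        def state_vector(self):
            # Outcome probabilities as amplitudes of a unit vector: the
            # sketch's stand-in for a decision state in a Hilbert space.
            return {t: math.sqrt(p) for t, p in self.outcome_distribution().items()}

        def decide(self):
            # Select the most efficient transformation and propagate it
            # along every outgoing edge (information channel).
            choice = max(self.transformations, key=self.transformations.get)
            for node in self.edges:
                node.receive(self.name, choice)
            return choice

        def receive(self, sender, message):
            print(f"{self.name} received '{message}' from {sender}")

    cell = DecisionNode("cell", {"metabolise": 1.2, "divide": 0.4, "dormancy": 0.1})
    tissue = DecisionNode("tissue", {"grow": 0.8, "repair": 0.5})
    cell.connect(tissue)
    cell.decide()   # prints: tissue received 'metabolise' from cell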
Principle 5- Decision/network processes are an integral part of evolution, involving the selection of the information that can manage the adaptation, or information differential minimisation, process. This in turn leads to the accrual of more complex decision capacity and value-based outcomes for the system: a positive feedback loop that is increasingly beneficial to the system over time, facilitating selection of the most appropriate information via decision probability outcomes. Information selection and injection using decision networks therefore reduces entropy within the system and increases complexity. In turn this increases life's capacity for survival by eventually allowing an increasing amount of information to be processed in subjective time.
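The entropy claim can at least be made concrete in miniature. Assuming selection is modelled as reweighting outcome probabilities by decision-derived weights (an assumption of this sketch), concentrating probability on favoured outcomes does lower Shannon entropy:

    import math

    def shannon_entropy(dist):
        # Shannon entropy, in bits, of a probability distribution.
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    def select(dist, weights):
        # Reweight outcomes by selection weights and renormalise:
        # a toy stand-in for 'information selection and injection'.
        raw = {k: dist[k] * weights.get(k, 1.0) for k in dist}
        total = sum(raw.values())
        return {k: v / total for k, v in raw.items()}

    before = {"a": 0.25, "b": 0.25, "c": 0.25, "d": 0.25}  # maximal uncertainty
    after = select(before, {"a": 8.0, "b": 1.0, "c": 0.5, "d": 0.5})

    print(shannon_entropy(before))  # 2.0 bits
    print(shannon_entropy(after))   # ~1.02 bits: selection reduced entropy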
Principle 6- The decision network theory is based on a highly abstract model of quantum information processing, including the laws governing matter and forces. It therefore has much in common with the latest quantum gravity model of spacetime, Loop Quantum Gravity. As described in the previous chapter, this model maps spacetime and physical processes via the medium of networks representing quantised volumes and areas and, in the latest models, qubits. Fundamental particles such as quarks and electrons are not primary in the LQG model but secondary outcomes of causal information flows generated and guided by such braided networks of quantised spacetime.
Principle 7- Evolution is therefore the basis of adaptation and increasing complexity, achieved through a process of directed information and energy flows managed by the system's self-organising processes. Self-organisation acts by continuously transforming the system's structure and topology, minimising energy and resource usage to create the most efficient and effective evolutionary outcomes.
The system seeks to maximise its information processing efficiency and minimise its energy costs in relation to its environment. This is represented in physical systems by the Principle of Least Action, as previously outlined. However, unlike competing theories of evolution, in this theory energy processing is secondary to the key requirement of optimal information management.
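For readers unfamiliar with it, the Least Action idea can be shown in a few lines of standard physics (this worked toy is generic textbook material, not specific to D-Net): discretise a free particle's path between fixed endpoints and nudge the interior points downhill on the action; the minimum is the straight line.

    N, DT, MASS = 20, 0.1, 1.0
    # A deliberately jagged starting path from x = 0.0 to x = 2.0.
    path = [0.0] + [float(i % 3) for i in range(1, N)] + [2.0]

    def action(p):
        # Discretised action: sum of kinetic-energy terms (no potential).
        return sum(0.5 * MASS * ((p[i + 1] - p[i]) / DT) ** 2 * DT
                   for i in range(len(p) - 1))

    for _ in range(2000):                  # crude gradient descent
        for i in range(1, len(path) - 1):  # endpoints stay fixed
            grad = MASS * (2 * path[i] - path[i - 1] - path[i + 1]) / DT
            path[i] -= 0.04 * grad

    print(round(action(path), 3))       # 1.0: the straight-line minimum
    print([round(x, 2) for x in path])  # ~evenly spaced points, 0.0 to 2.0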
Principle 8- Applied to living systems, evolution therefore results in an increase in organisational and computational complexity over time, which then results in an increase in the system's environmental complexity through a process of reciprocal adaptation. Complex systems tend to be more stable as well as more adaptive than simpler ones. This provides improved selective advantage and ensures the system transforms over time to one of greater complexity.
Principle 9- The inevitable outcome of this positive reinforcement of life's complexity and optimisation as a sophisticated knowledge processor is the emergence of a seamlessly cooperative network or entity that eventually takes on the more abstract form of a global/universal consciousness- Omega, as described in the final chapter.
Life's capacity to continually expand and improve its abilities as an extremely efficient processor of information, using the evolutionary process, is now testable as a scientific hypothesis. Since life emerged on this planet almost 4 billion years ago it has continued to increase its processing capability through the development of more flexible and adaptable cellular and neural structures, as well as adaptive body forms. In addition, all life, from multi-cellular organisms to human societies, has learnt the value of cooperation. Humans, the most advanced information processors on earth, now utilise a variety of sophisticated mathematical and computational techniques, including artificial intelligence, cellular automata, quantum computing, neural networks, evolutionary algorithms and biological computing, and have vastly amplified their problem-solving capability through the internet's massive computational intelligence.
This trajectory will continue at an exponential rate.
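To make one of the techniques just listed concrete, here is a toy evolutionary algorithm (the target string, population size and mutation rate are arbitrary choices for illustration): selection plus random variation ratchets a population of bit-strings towards a target, a miniature of the complexity-building loop the principles describe.

    import random

    random.seed(0)
    TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]

    def fitness(genome):
        # Number of bits matching the target.
        return sum(g == t for g, t in zip(genome, TARGET))

    def mutate(genome, rate=0.1):
        # Flip each bit independently with the given probability.
        return [1 - g if random.random() < rate else g for g in genome]

    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(30)]
    for generation in range(50):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == len(TARGET):
            break
        # Keep the fittest third; refill the population with mutated copies.
        population = [mutate(random.choice(population[:10])) for _ in range(30)]

    best = max(population, key=fitness)
    print(generation, fitness(best), best)  # typically reaches 10/10 quickly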
Systems, network, information and quantum theory therefore provide the theoretical basis for the unified evolutionary theory postulated in this book. Some of the most critical aspects are now discussed.