uwot wrote: Obvious Leo wrote:
My process model is exclusively an information theory, since I literally model the universe as a non-linear computation. I therefore regard information and energy as entirely synonymous constructs.
Fine. But what is it and how does it work?
Energy is energy is information. The whole idea of irreducibility means that it admits of no further definition, but the irreducible unit of a computational model is a time interval, since a computation is a solely temporal construct. In my model I call this irreducible time interval a monad and model these monads as mathematical points on a continuously emerging and self-generating wave. The monad has only two physical properties: its information/energy content and the duration of its existence in that state.

Thus we say that each monad in the continuously emerging continuum simply BECOMES its own next monad in a different informational state, and the new informational state is causally determined by the behaviour of all the other monads in the continuum, this influence being propagated at the speed of light. Thus the speed of light is simply the processing speed of the cosmic computation, or the speed of causality at the Planck scale. However, what drives the self-causal mechanism towards the lower entropy state is the asymmetrical relationship between gravity and time: because information is physical, the duration of existence of each monad in any given state is determined by the amount of information it contains. This is a perfectly logical and mandated conclusion from GR in a spaceless universe.

If this notion is translated into Conway's model it's not all that hard to visualise how such monads can encode for subatomic particles at an emergent level, and how these particles then go on to encode for more complex matter is reasonably well known to physics. The important thing to understand is that at the Planck scale this process is entirely self-determining solely because of gravity, and it is only at the emergent level that physical "properties" become apparent.
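Since Conway's model gets invoked here, the emergence point can be made concrete with a toy (my own illustrative Python sketch, not part of the monad model itself): in Conway's Game of Life a purely local update rule produces a stable emergent "particle", the glider, which reappears in its original shape every four steps, shifted one cell diagonally.

```python
from collections import Counter

def step(live):
    """One Game of Life generation on an unbounded grid of live-cell coordinates."""
    counts = Counter((r + dr, c + dc) for r, c in live
                     for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                     if (dr, dc) != (0, 0))
    # a cell is alive next step if it has 3 neighbours, or 2 and is already alive
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
g = glider
for _ in range(4):          # the glider has period 4
    g = step(g)
shifted = {(r + 1, c + 1) for r, c in glider}
print(g == shifted)         # True: same shape, moved one cell down-right
```

Nothing in the rule mentions gliders; the "particle" and its motion exist only at the emergent level.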
In other words it is not the "laws of physics" which make reality but reality which determines the emergent order which we define in terms of the "laws of physics".
uwot wrote: If you can show me that your model isn't undetermined,
This model is completely deterministic but non-Newtonian. The cosmos is a computer without a programme which quite literally makes it up as it goes along. Such self-determining non-linear processes ALWAYS lead to a decrease in the total entropy of the system, whereas in the Newton/Laplace model of determinism the gigantic unwinding clock mandates a continuous increase in the overall entropy. This completely contradicts the evidence, because the big bang was HOT and the universe has been cooling down and self-creating increasingly more complex informational substructures within itself ever since, including us.
uwot wrote:the "gravity/time continuum hypothesis yields a testable prediction",
No scientific hypothesis can ever be proven true, but any hypothesis worth the name must always be falsifiable. I have outlined the protocols for an experiment which falsifies the spacetime hypothesis in my synopsis, but as far as I'm concerned this was already done by Michelson-Morley and by Einstein, Podolsky and Rosen. The EPR paradox can only be explained if Cartesian space is an observer effect, and so can every other paradox and metaphysical absurdity in physics. The subatomic world starts to look more like a Bohmian one when the behaviour of subatomic particles is understood as being chaotic rather than random, and there is a perfect synergy with GR as well because we already know that the behaviour of all cosmological bodies is chaotic rather than random. Physics is merely conflating non-predictability with indeterminism, but any weather forecaster knows that this is bullshit.
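The chaotic-not-random distinction is easy to demonstrate with a toy system (an illustrative Python sketch of the textbook logistic map, nothing specific to the model above): the map is perfectly deterministic, yet a nudge at the tenth decimal place destroys predictability within a few dozen steps — exactly the weather forecaster's predicament.

```python
def trajectory(x0, n, r=4.0):
    """Iterate the logistic map x -> r*x*(1-x), a standard chaotic system."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = trajectory(0.3, 60)
b = trajectory(0.3, 60)           # same seed: identical path, i.e. deterministic
c = trajectory(0.3 + 1e-10, 60)   # tenth-decimal nudge: soon a different path
print(a == b)                     # True: fully determined
print(a[40:] == c[40:])           # False: predictability is long gone
```

Deterministic yet non-predictable: that is chaos, not randomness.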
uwot wrote:
Well, if time is a measure of change, then it goes without saying. However, if you partition reality into packets, even really tiny ones, you necessarily introduce spooky action at a distance, albeit a teeny-weeny distance,
You're not paying attention, uwot. Distance is a purely epistemic notion in this model and Bell's theorem looks exactly like a statement of the bloody obvious. Our notions of locality and non-locality are products of our own minds because we are applying a spatio-temporal extension to what are purely temporal phenomena. REALITY IS NOT A PLACE.
uwot wrote:Fine and dandy to a mathematician, they don't need to know what causes gravity,
Nothing causes gravity. If time and gravity are simply two different ways of expressing the same thing then this inversely logarithmic relationship makes gravity/time the causER and not the causEE. The entire notion of regarding gravity as a "force" is completely wrong-headed because the gravity/time relationship becomes the sole self-organising principle of the universe. At the Planck scale nothing else is needed.
Greta wrote:
Certainly it appears that phenomena don't always operate linearly, but why? When proportionality of input and output is skewed, what's going on?
Non-linear dynamic systems theory is a mature science with a rigorous methodology, and it is used to model every naturally occurring system except for those in physics, which seems to be trapped in a Newtonian time-warp. That physical reality evolves from the simple to the complex is a law of nature as fundamental as 1+1=2, and why it should do this is perfectly well understood. In a non-Newtonian world all determinism is non-linear, and chaotically determined systems evolve from the simple to the complex purely because they cannot do otherwise. However, this makes the problem of physics not only metaphysical but also meta-mathematical, because Newton's classical mathematics cannot model such a self-determining reality.

Einstein eventually came to this conclusion not long before he died, but Henri Poincare had already pointed this out half a century earlier when he emphatically rejected the Minkowski model as bollocks. He was working on the three-body problem, which had been around since Newton, and was trying to develop the new mathematical tools which would be needed to solve it. Unfortunately he died before he got very far with it, but it was Poincare who laid the groundwork for the future development of fractal geometry, the mathematical system which underpins all of complexity theory. It will also underpin the new physics when the geeks get their heads out of their arses and understand that the universe is a PROCESS.
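For what it's worth, the fractal-geometry point can be made tangible in a few lines (an illustrative Python sketch, nothing more): the Mandelbrot iteration z -> z^2 + c is about as simple as a rule gets, yet the set it defines has a boundary of endless complexity — simple rule, complex emergent structure.

```python
def in_mandelbrot(c, max_iter=200):
    """True if the orbit of 0 under z -> z*z + c stays bounded (escape radius 2)."""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return False
    return True

print(in_mandelbrot(0j))        # True: the orbit stays at 0 forever
print(in_mandelbrot(-1 + 0j))   # True: the orbit cycles 0, -1, 0, -1, ...
print(in_mandelbrot(1 + 0j))    # False: 0, 1, 2, 5, ... escapes
```

Zooming in on the boundary between such points reveals detail at every scale, which is the hallmark of fractal geometry.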
Greta wrote:
Energy is the material and information the design of the material. What of high energy, low order states like atomic bomb blasts as compared with low energy but highly ordered operations of computers? Leo, surely there's some fundamental difference - that energy and information cannot be boiled down to one or the other or be treated as synonymous, but they have more of a complementary relationship, like light and shadow.
I don't think you're getting the distinction between linear and non-linear determinism, Greta, and this distinction is central to what I'm banging on about. Newtonian physics is inescapably a Platonist or creationist paradigm where the behaviour of matter and energy is seen to be the product of transcendent cause. This model is both Leibnizian and Spinozan because it defines a universe of immanent cause which is sufficient both to its own existence and to the existence of every complex physical entity it contains, including life and mind.
attofishpi wrote:The chap that made the video in my OP
The bloke that made the video in your OP was on the ball, atto, and he made a very telling comment which accords perfectly with my own model. He said that the speed of light is the speed of causality, and this is a very significant point because it ties time, gravity and the rate of change in a physical system to the speed of light. It fits very nicely with my own definition of the speed of light as the processing speed of the cosmic computation, and once the exact relationship between gravity and time is understood we can see that this makes the speed of light the most inconstant speed in the universe, variable all the way down to the Planck scale. This is what brings our universe to life in all its splendour and complexity.
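The "speed of causality" reading has a neat toy analogue (my own Python sketch, assuming nothing about the monad model): in a cellular automaton where each cell's next state depends only on its immediate neighbours, flipping a single cell can never influence anything outside a cone that widens by one cell per step — a speed limit built into the computation itself.

```python
def step(cells):
    """One update of elementary rule 90: each cell becomes the XOR of its neighbours."""
    n = len(cells)
    return [cells[(i - 1) % n] ^ cells[(i + 1) % n] for i in range(n)]

n, steps, centre = 101, 30, 50
a, b = [0] * n, [0] * n
b[centre] = 1                      # the two worlds differ in exactly one cell
for t in range(1, steps + 1):
    a, b = step(a), step(b)
    diff = [i for i in range(n) if a[i] != b[i]]
    # the disturbance never outruns one cell per step: |i - centre| <= t
    assert all(abs(i - centre) <= t for i in diff)
print("every difference stays inside the causal cone")
```

The maximum rate at which the single-cell flip can propagate plays the role of c: causal influence spreads at the update speed of the rule and no faster.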
Arising_uk wrote:Greta wrote:
Certainly it appears that phenomena don't always operate linearly, but why? When proportionality of input and output is skewed, what's going on?
I'm not sure what you meant by 'operate linearly', but if it means non-sequentially, I don't think non-linear means that in this instance — though I stand to be corrected on both counts.
Correct. Non-linear does NOT mean non-sequential. In chaotically determined systems effects are always preceded by causes in an orderly and generative fashion. However the order and complexity of such systems is self-generating rather than imposed by a programme from outwith the system itself, which of course is the assumption in physics.
uwot wrote:Greta wrote:Energy is the material and information the design of the material.
This, if you turn it on its head, is the gist of field theories, like the Higgs field. GR is basically a field theory. The idea is that there is something substantial, a field, which is basically rippled or contorted. It is these contortions, the design, if you like the information, that is the source of energy. It's a bit like a surfer has the energy to ride down a wave. If you hit water hard enough, you can create a whirlpool. In effect, that is what the Higgs boson is.
This bogus field ontology has been around forever and nobody has ever succeeded in making much of it. Fields do not make reality. Reality makes itself, but the way in which it does this is something which an astute observer can model in terms of fields, waves, forces, particles etc. None of these mathematical notions have any ontological status, but they do attest to the remarkable ability of the human mind to linearise the non-linear.