Thursday, April 22, 2021

Information, Time, Energy

information_time_energy.rst.rst

Information is all. Variable, time, energy.

Summary

In the evolution blog I started from "there is no one time"; rather, every independent change is its own time. This already says that information is time. To say "time is all" or "information is all" is equivalent. Statements are always meant per variable.

The objective here and in the evolution blog is an interpretation of physics, without the details needed to produce mathematical consistency.

To say W = I ⁄ t is like saying that energy is time divided by time. W stands for work.

Variable/value:

Values of a physical variable are defined by time and define time for the variable. Every variable has its own information and its own time.

Cycle:

The finite set of values of a variable cycles until the variable ceases to exist. A variable has a curvature when expressed with its observables. Values exist only while occurring.

Information = time:
 

The information of a variable is the number of values in a cycle. I = ∫W dt = Wt. W is constant and represents I. I cycles over a space extent in a time period t. The variable as a value (quantum) has a space-time extent.

Momentum: p = ∂I ⁄ ∂x = mẋ.

Component of a value of a variable. Component of a time step.

Energy: W = −H = ∂I ⁄ ∂t.

I is constant: dI ⁄ dt = (∂I ⁄ ∂x)(dx ⁄ dt) + ∂I ⁄ ∂t = 0. The great trick is to associate the observed changes (the values) to the location where they are observed (create a field). Then one can look at a part of the system (Lagrangian L) and see the rest as potential P(x), from which information flows in: L(x, ẋ) = K(ẋ) − P(x) is an information unit (time unit) with a space direction. J = ∫L dt counts change and ends with 0 after a full cycle.

Force: F = ∂W ⁄ ∂x = −∂H ⁄ ∂x = ṗ.

Component of energy. Component of momentum change.

Levels:

A level consists of parallel variables (encapsulations of interactions or particles). The variables of the lower level become values in this level. Every level has its independent variables with their own times.

Resolution:

Every level has its own information resolution. On the lowest physical level the information unit is h.

Mathematical summary:

I = τ
0 = dI ⁄ dt = (∂I ⁄ ∂x)(dx ⁄ dt) + ∂I ⁄ ∂t ⇒ W(x, p) = −H(x, p) = ∂I ⁄ ∂t
W = ∂I ⁄ ∂t = ∂τ ⁄ ∂t = f
W = (∂τ ⁄ ∂x)(dx ⁄ dt) ⇒ W ∂x ⁄ ∂τ = ∂x ⁄ ∂t
H = pẋ − L = K + P
Hamiltonian: dL ⁄ dt = (d ⁄ dt)(ẋ ∂L ⁄ ∂ẋ) ⇒ 0 = (d ⁄ dt)(ẋ ∂L ⁄ ∂ẋ − L)
L = pẋ − H = K − P [Lagrangian: phase of a component, while the constant H is for the full system]
δJ = ∫(δx ∂L ⁄ ∂x + δẋ ∂L ⁄ ∂ẋ) dt = ∫ δx (∂L ⁄ ∂x − (d ⁄ dt)(∂L ⁄ ∂ẋ)) dt
δJ ⁄ δx = 0 ⇒ ∂L ⁄ ∂x = (d ⁄ dt)(∂L ⁄ ∂ẋ) ⇒ F = ṗ
F = ∂L ⁄ ∂x = ∂W ⁄ ∂x = −∂H ⁄ ∂x
p = ∂τ ⁄ ∂x = ∂L ⁄ ∂ẋ = ∂W ⁄ ∂ẋ = mẋ
W = (∂τ ⁄ ∂x)(∂x ⁄ ∂t) = mẋ²
P = ∫(∂W ⁄ ∂x)dx = ∫F dx = ∫ṗ dx = ∫m(dẋ ⁄ dt)dx = ∫mẋ dẋ = mẋ² ⁄ 2 = K
W = P(x) + K(ẋ) = mẋ²

Information-Time-Energy

Information

When we think of information we think of a language that conveys information. But first there need to be the alternatives the words of a language select from.

Our language consists of words. The word selects one of the concepts in the mind of the person we talk to. The concepts in the mind refer to real things. Tree, stone, house, ...

Dynamic systems evolve by alternatives and selection (mutation and selection). Think of biological evolution, mind, science, economy, ... There is a creative phase and a selective phase. In human contexts they can also be called "search and find" or "trial and error" or "learn and control". But the two words can also refer to the same thing: the selection itself brings the thing into existence in the first place. Physical processes can be seen this way.

Many physical systems have no memory, but they have information. And, although we have a memory of concepts, they only become conscious at the times we think of them.

But the mathematical set has operations like union and intersection. They are more complicated and can be decomposed into individual selections. The ∈ of a set selects one element from the set. A set with only ∈ is more fundamental. It is the ubiquitous variable. How could it be otherwise? Something so fundamental must be ubiquitous.

The variable is the foundation of mathematics, and more general the foundation of all dynamic systems.

The variable consists of values. Other words for values are alternatives or states.

A variable consists of all the values

  • that occur (exhaustiveness) and
  • that exclude each other (exclusiveness)

Information is the number of values of a variable.

Time

The values of the variable occur and exclude each other. In casual English one would say "one value at a time". That is exactly what physical time is.

A time step is the selection or change of value of a variable. In between selections that variable has no time.

Every variable has its own time.

A time step is a value of the variable.

Components of values ( = coordinates = observables), that change at the same time, physically constitute only one value.

Energy of a Variable

The values of a variable can occur slowly or fast. But that can only be seen, if we have another variable to compare to. Our mind has an internal clock that gives a comparison.

When we run a film in slow or fast motion, we get an idea that the time during film shooting can be seen as either slow or fast, depending on our reference time.

In other words: energy is time compared to time. The first "time" we call information or proper time.

W = ΔI ⁄ Δt = Δτ ⁄ Δt

The comparison W = ΔI ⁄ Δt is with an unrelated other variable t. This happens only in the mind. It is not physical. The other t is an arbitrarily chosen unit of information, and W is the numerical value of the information (I = Et). One can also do it in reverse; then 1 ⁄ t (Hertz, Hz) is the unit of W and I is the numerical value.

Energy is the rate of selection, or information rate or frequency.

In nature many variables are isolated. With just one variable the only time is its own, and W = ∂I ⁄ ∂I = 1. The concept of information demands values, changes, a time of its own. I itself has a proper time.

Do we need to choose another variable to have time? No. Our variable changes its values and that is the proper time of the variable. If values do not change, then there is no time and so the variable does not come into existence.

The variable thus defines:

  • information
  • and time

Information implies time and time implies information. Time and information are equivalent, physically. With just one variable information and time are synonyms and energy is just 1 or has no meaning.

When comparing to another variable, information implies energy and energy implies information. The comparison is often just a thing of our mind. The physical motivation for energy comes when the selections of one variable matter with respect to the selections of another variable. The other variable is called time to distinguish it from the first variable, but that time is still information.

A physical change is looked at by arbitrary coordinates of the mind. Several dx₁, dx₂, ... can constitute one physical change dI.

The value dI of a physical variable is an interaction between observables. An interaction is one time step with possibly several participants/observables.

The xᵢ take part in an interaction:

W = ∂I ⁄ ∂t = −H = −(∂I ⁄ ∂xᵢ)(dxᵢ ⁄ dt) = −pᵢẋᵢ

Cycle

As long as a variable exists, it cycles through its values with constant rate W.

How much external time a cycle takes depends on

  • the number of values (information)
  • and the rate of selection (energy)
t = I ⁄ W
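
A minimal numeric sketch, with assumed example numbers rather than values from the text:

I = 1000   # number of values of the variable (assumed)
W = 50     # selection rate: values per unit of external time (assumed)
t = I / W  # external time for one full cycle
print(t)   # 20.0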

Selections order the values. If the last value is reached, the selection continues with the first. This brings the first value near the last. How can you do that with one variable alone? We need to distinguish between mind variables and physical variables. Mind variables vary separately only in the mind, while physically they are only a component of a change. One physical value has several observable components, which do not vary independently and are thus not physical variables.

In two dimensions you can create a circle, in which the last value is close to the first one. And indeed, nature has examples of values with two observables: think of the electric and magnetic fields in electromagnetism, or the elevation and velocity of a pendulum.

What is meant by a variable here is defined by a change, i.e. by a time step. Within one Δt changes of at least two observables combine to one.

All the value combinations of two or more observables together form a physical and inherently cyclic variable.

Levels

The physical world is layered. A level is defined by its variables and interactions.

Every level has

  • an information quantum, which implicitly defines
  • a typical information rate (energy)

On the physically lowest level, it is the Planck constant h:

EΔt = ΔI = h

Δt and ΔI are both information of some distinct variables. On the lowest level we have Δt = ΔI = h and thus W = 1.

ΔI also defines an acceptable deviation for a level. This is a generalization of energy levels of an electron in atom orbitals, and would be called information levels here.

One level builds on top of the other. All the cumulative changes through the levels are limited by the rate on the lowest level. Higher level changes are slower, because they involve lower level changes over more layers.

If a higher level changes faster, then the levels below need to get slower, because h itself does not change.

In higher levels the quantum of information can be quite large. One can still choose a natural unit of information for a level, like h for the lowest level.

Speed vs rate of information

W = I ⁄ t compares time with time. There is no physical space involved. So information cannot be attributed a speed in physical space.

Physical space is contained in the definition of a specific I through the simultaneous changes of the components of I along the space directions (momenta).

If I say a word, the travel time of the word to my interlocutor and its interpretation into a concept is one value in our interaction, in our communication. The changes (observables) of the communication partners to form and interpret the message are shared between the partners. This idea applies also to lower level physical interactions.

On the lowest level the signal speed is the speed of light c. On higher levels it is a lot slower (but could be called the c of the level).

c compares to an external time t already. ct removes that external time. This gives the proper time unit dI a space extent.

A value does not move from place to place, but it has a space extent, a space quantum. The components of a value are quantized. In the lowest level this is ΔpΔx = h.

The next value can occupy a different space close by. With hν photons that space is λ = c ⁄ ν away (hν = hc ⁄ λ), t = 1 ⁄ ν later (Et = hνt = h).
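
A small numeric sketch of this with scipy.constants; the frequency is an assumed example value:

import scipy.constants as sc
nu = 6e14          # assumed example: a visible-light photon at 600 THz
E = sc.h*nu        # energy of the quantum, E = h*nu
lam = sc.c/nu      # space to the next value, lambda = c/nu (~5e-7 m)
t = 1/nu           # time to the next value, t = 1/nu
print(E*t)         # equals h again: 6.62607015e-34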

Higher energies cycle locally, which binds some h in a mass m. m encapsulates all the lower levels. W² = p²c² + m²c⁴. If p = 0, all the energy is within m.

The p = mẋ = h ⁄ λ attribution of smaller λ to higher momenta is due to the many parallel lower level particles averaging and producing a space precision that is not there in the single particle.
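
A small numeric illustration of p = mẋ = h ⁄ λ; the electron speed used is an assumed example value:

import scipy.constants as sc
v = 1e6            # assumed electron speed in m/s
p = sc.m_e*v       # momentum p = m*xdot, ~9.1e-25 kg*m/s
lam = sc.h/p       # wavelength attributed via p = h/lambda, ~7.3e-10 m
print(p, lam)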

Particle

If a variable itself is closed and it takes part in a higher level interaction, then it forms a value component of the higher level variable. The value component carries no information in the higher level variable, but it has internal information.

A physical variable that acts as a value in a level is sometimes called a particle. A particle is a synonym for a variable, used to make the distinction in specific contexts.

Particles are information quanta. Particles have or are a time extent and also have a physical space extent. So the particle is a space-time quantum.

A variable level builds on a particle level. In an interaction between people, the person is the particle. Looking at a person's thinking as a variable, concepts are the particles.

The interactions in higher levels take a longer time and involve more physical space. But the information needs to cycle during such long times to be conserved. There are cycling encapsulations all the way down to the lowest level, which cycles with h. Interactions in higher levels are via particles of lower levels.

A particle has its internal interactions, its internal time. Mass is another name for energy, meaning the inner cycling of a particle.

Static vs dynamic Information

Our mind/brain has its own time. We often neglect the physical time implied by a physical variable and use our brain time on the values instead, but that brain time is a different time than the time of the observed variable itself. Mind variables are also physical, because the mind is physical, but when mapping from reality, the time is replaced by that of the brain.

In mathematics the same logic can be followed by different brains, i.e. different times, different time durations. Mathematics considers variables without time, but to actually exist all these variables need to be thought, i.e. time needs to be added. Mathematics often abstracts away how the values came into existence, and that they came into existence at different times.

In a variable without time we only have the count of values (static variable). One can make the count of values using combinations of values of other static variables. We use the variable 0, ..., 9 (digit) to count or the a, ..., z (alphabet) to address concepts. One could also use digits for addressing. Computers use the bit 0, 1, because that is the smallest variable one can still choose from. Since the bit is smaller than the digit, the word length needed is larger (8 digits address 10⁸ = 100000000 values, 8 bits only 2⁸ = 256).

The number of bits/digits/alphabets needed to produce the combinations I:

S = logI

S counts the unit variables needed to produce a value combination count. The unit variable itself counts as 1. This view is from a level where the variable is a value.
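
A minimal sketch of S = logI as word length, using the combination count from above as an assumed example:

import math
I = 100_000_000                     # value combinations to address (assumed example)
bits = math.ceil(math.log2(I))      # 27: bits needed
digits = math.ceil(math.log10(I))   # 8: decimal digits needed
print(bits, digits)                 # the smaller unit variable needs the longer word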

Between levels, when including a lower level, e.g. because the lower level matters, we transition from addition to multiplication. In the other direction, we transition from multiplication to addition, i.e. we use the logarithm.

In thermodynamic systems we have two levels. The upper level does not distinguish between all the 2^S combinations of values from the lower level.

Entropy is the upper level part in a two level system. Entropy is the count of independent variables, the molecules, whose values (timing) are compared independently (lower level energy).

The lower level part is the temperature T, which is the average energy of a molecule.

The interactions between upper level compartments would be to exchange molecules, i.e. entropy S.

Lower level temperature interaction (heat equation) is quite similar to the quantum mechanical Schrödinger equation. Both compare the time of one level with a two level process. Change and thus time happening in the upper level is due to different information rate in the lower level.

For comparison, the wave equation compares the two times of two levels (not one time) with the space components of two levels.

Temperature T is the average energy per molecule. Similarly pressure p is the average energy per volume (energy packet = particle) and V is a higher level variable that counts the lower level packets.

ST = pV

A higher level equilibrium means no time in the higher level, but it corresponds to a maximum number of lower level interactions. All the exchange of information (W) is in the lower level and at equal rate in both directions.

If more exchange were in the higher level, the lower level would have less.

The molecules have still further lower levels and they exchange information there, too: via electromagnetic radiation. If the temperature increases, the molecule velocity increases. Velocity alone has no energy, because it is a value, but in collisions a higher velocity means more steps to reach 0 or v. Molecule velocity change is in the thermal level. Because there is more change in the thermal level, the atoms' orbital timing needs to decrease or increase, which produces discrete photon emissions or absorptions in the orbitals; these experience a random Doppler shift due to relative thermal motion, which leads to the Planck law.

Energy as Information Flow

W = ∂I ⁄ ∂t can compare the whole variable to some unrelated variable t, in which case I stays constant and the rate W = I ⁄ t also stays constant. But we can also look at a part of the system, and see W = ∂I ⁄ ∂t as inflow or outflow of information (see Lagrangian below).

Adding or removing information to the system is a higher level time. If information is added to a variable, the variable becomes a different variable.

Information can exist only as cycling variables. So information is transported as information/energy packets (particles), for example as molecules of a certain chemical energy content. The molecular interactions use atoms as packets. The atomic nucleus uses nucleons as packets. Every layer has its own packets.

Every layer has its own energy, i.e. frequency of packet exchange. A variable serving as time to compare to is level-specific. Energy is expressed in a unit relevant for the level. Relevant, usable energy is level-specific.

If levels interact, more levels need to be considered. If one level's frequency is called energy, then the next higher level's frequency is called power. For example, in electricity the energy W = UQ is an energy of one level and the power P = UI = fW is the energy of the next level.
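
A hedged numeric sketch with assumed values for U, Q and the exchange frequency f:

U = 5.0        # voltage in V (assumed)
Q = 2.0        # charge moved per exchange in C (assumed)
f = 10.0       # exchanges per second (assumed)
W = U*Q        # energy of one level: 10.0 J per exchange
P = U*(Q*f)    # power of the next level: U*I with current I = Q*f
print(W, P, f*W)   # 10.0 100.0 100.0: P equals f*W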

The energy is important as a measure to express the relative rate of information exchange between systems. How fast an exchange is in comparison to the other, decides

  • where the accumulation of information happens
  • who survives how long

If we have only accumulation on one side, the joint system dies when there is nothing to accumulate any more.

If there is a back and forth of accumulation, the joint system survives longer. One system is the potential energy for the other system and vice versa. The states of the joint system are the values of the system as a variable.

The exchange of information packets takes time, but that time is shared between the two systems. The time step thus makes both changes as one.

A variable is an information unit I. Comparing it to an external variable t cannot change the internal physics. Energy W = I ⁄ t is a property of the variable, not of the value, and especially not of the components of a value. To express energy as functions of values gives a wrong picture. It is an indirect mapping: value -> variable -> energy. The variable has one energy. All values just map to this energy, which is the same for all values.

Saying W = mẋ² ⁄ 2 + mgx describes the v, h observables sharing the same energy W, i.e. the energy of a variable and not of a value. Expressing W = mẋ² ⁄ 2 or W = mgx separately and as a function of values has no meaning.
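
A minimal numeric sketch of this sharing; mass, g and drop height are assumed numbers. Along a free fall the v and x observables trade mẋ² ⁄ 2 against mgx, while W stays one and the same value of the variable:

import numpy as np
m, g, h0 = 1.0, 9.81, 10.0            # assumed mass, gravity, drop height
x = np.linspace(h0, 0.0, 5)           # height observable along the fall
v = np.sqrt(2*g*(h0 - x))             # velocity observable at each height
W = m*v**2/2 + m*g*x                  # one energy, shared by both observables
print(W)                              # ~[98.1 98.1 98.1 98.1 98.1]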

The word energy is often used in the sense of information, as understood here. Here information implies time and thus also energy. That physically they are the same is the major statement here.

Mathematics uses information in a static sense, although physically it exists only when processed by a brain. Also physics uses entropy S or mass m for static information, but according to the understanding here, this just neglects lower level dynamics (m) or this level's dynamics (S). So, although not physical, distinguishing between energy and information makes sense as a tool to give a shorter description of a local context.

Interaction between Levels

"Information is time" means that information does not exist without processing. Higher level particles have more inner processing and are thus higher in low level energy. They are energy packets.

The high level interactions can be slow (low in level energy) compared to lower levels. A level has a more or less constant information rate. The parallel particles encapsulate more lower levels (animal, cell, molecule, atom, ...). Each level has information processing and thus stores energy.

Information flows between levels, too. For example, when two molecules react, they release energy to the thermodynamic level (Enthalpy ΔH). Lower level variables get destroyed to create higher level variables, i.e. higher level processing.

The Maxwell Demon (controller) works between levels. Many-level systems like living beings (microorganism, plants, animals, ...), but also companies or social structures in general use this principle of control.

The controller maps the higher level logic to a lower level, which processes faster. The lower level simulates the logic of the higher level. As the lower level is faster, it can pick high energy packets. Then the controller uses the inner energy of the high energy packet, to keep its own interactions (keep T) running or to reproduce (increase S) (change T or S in a generalized W = TS).

Higher energy packets demand energy storage. Storage divided by consumption determines the rate of high level interaction. Higher level exchange rates are slower.

When the higher level changes its logic the controller must adapt (learn to control). Such changes are slower than the selection of the energy packets.

  • The DNA in living organisms is a mapping of the ecosystem. It changes with the ecosystem or gets extinct.
  • Emotions change with the availability of resources over generations.
  • Rational thinking adapts within the lifetime.

That the total information flow distributes to complex levels on earth is due to slow cooling (annealing) over a long time. If W goes down, ST can keep a constant T by reducing S at a level, e.g. by making larger molecules. Systems that encapsulate live longer in the presence of cooling. To live longer means a smaller rate, i.e. less W. The same happens in the learning brain, in the economy, and in other dynamic systems.

Newton

Newton (rephrased): An object rests or moves in a straight line with constant speed, unless there is an interaction (force) with another object and that force changes both objects (actio=reactio).

A straight line would imply an infinity. There is no infinity in the physical world. Real systems always cycle until they cease to exist as a system. Every curved line seems straight with enough zooming. That is why in physics one always uses manifolds instead of the flat ℝⁿ. Newton's straight line needs to be replaced by a geodesic, whose curvature is that of the components of the cycling variable.

Information implies time. Time is force. There is always a force.

Normally one looks at objects that are obviously interacting. They have a time. An isolated object does not exist. If you found one, it already interacted with you. If that interaction does not explain its behavior, you need to search for other objects it interacts with.

With the actio=reactio, it is implied that the two object's changes are observables of one change, and thus constitute one time step. The force is shared between the interacting objects.

Velocity cannot be seen or measured physically from inside its own inertial frame, so it has no information. And it has no information from outside either. Velocity is a component of a value and not a physical variable. A value has no information. Only a variable has information, and thus exists. Not even a change of velocity, as seen from the flat space of our mind, exists, because it is a value and not a variable. And indeed within a geodesic the acceleration cannot be measured.

In the following, assuming (∂τ ⁄ ∂x)(dx ⁄ dt) constant demands that ∂τ ⁄ ∂x and dx ⁄ dt change in opposite directions (the minus sign). Mathematically speaking that is partial integration with vanishing integral, which is done below with the Lagrangian. Here we demand constant energy, while in the Lagrangian method constant energy follows from stationary action/information. But energy stands for information, too (W = I ⁄ t).

0 = dW ⁄ dt = (d ⁄ dt)[(∂τ ⁄ ∂x)(dx ⁄ dt)]
  = −(d ⁄ dt)(∂τ ⁄ ∂x)(dx ⁄ dt) + (∂τ ⁄ ∂x)(d²x ⁄ dt²)
  = −(∂W ⁄ ∂x)(dx ⁄ dt) + m(dx ⁄ dt)(d²x ⁄ dt²)
[(∂ ⁄ ∂t)(∂τ ⁄ ∂x) = ∂(∂τ ⁄ ∂t) ⁄ ∂x = ∂W ⁄ ∂x]

Divide by dx ⁄ dt to get Newton's force law:

[F ≡ ∂W ⁄ ∂x] = ma

The part in brackets is a definition of force.

To get to Newton's formula an unexplained step was used:

p = ∂τ ⁄ ∂x = m dx ⁄ dt

This is thus a consequence of Newton's force law. p = mẋ is by observation. Then it is assigned to ∂τ ⁄ ∂x by definition. If we make that definition, then W = mẋ² follows further down. The physics behind that is that ẋ and x are independent components of a time step: Δτ = mẋΔx.

That a value has components solves the vis-viva debate that was going on between Newton (mẋ), Leibniz (mẋ² ⁄ 2) and others. ∂τ ⁄ ∂x is a component of a change, i.e. of a time or information step. Time is an interaction with several partners. This leads to the concept of energy:

Also mass turns out to be a kind of energy:

W = ∂τ ⁄ ∂t = −H = −(∂τ ⁄ ∂x)(∂x ⁄ ∂t)
∂x ⁄ ∂t = H ∂x ⁄ ∂τ = m ∂x ⁄ ∂τ

Comparisons to t are not physical, but a necessity of the mind. By comparing more observable changes to one external time t, one can relate changes and create a topology and a metric on it for a specific system.

W = (∂τ ⁄ ∂x)(∂x ⁄ ∂t) = m(∂x ⁄ ∂t)(∂x ⁄ ∂t) = mẋ²

W is the full energy. m summarizes lower level energies. With c as maximum v, there are no lower level changes possible any more, and thus mass is exactly the movement itself: m = W ⁄ c².

Mind vs Reality

Our mind is a physical system itself, and has its own time. Actually there are independent parallel processes in the mind, which have separate times. But they are compared, which creates one time and the feeling of consciousness.

A variable is defined by its values. The number of values is the information I of the variable. dI is the system change and thus the system time.

"Space" means generally the value-components of an interaction (a value), not necessarily physical space.

Values only exist in conjunction with the variable, which exists because it has information and time. The space values exist only when actually happening. This also applies to physical space. Our memory of physical space, for example when moving the hand through the air, is not the physical space itself.

A change can have several components within the same time. The components are mind variables, also called observables or coordinates. The mind can change them independently, i.e. give them their own time, but the physical system may not.

Comparing independent variables, results in these quantities:

  • energy W = ∂I ⁄ ∂t compares times of two variables
  • momentum p = ∂I ⁄ ∂x compares time with a component

Independent variables have separate times. Independent variables can exist in parallel, at the same higher level time, or sequentially.

Entropy S = logI counts parallel variables of the same kind, whose actions sum up physically. S = logI is also the word length of a language used to address the values of a larger variable.

A variable exists as long as its values cycle. Since the values are cyclic, there need to be at least two components to connect the last value with the first. p, x are two such conjugate components. They are called phase space to express in which phase of its cycle the variable is. A dI time step corresponds to a (∂I ⁄ ∂x)dx = p dx in the phase space.

x is a mind variable, where we can spend a lot of time looping to arbitrary precision, but the physical dI is limited by the Planck constant h. h is the smallest, lowest level, unit of counting, i.e. the smallest time unit of nature. Nature is layered, though, and every level further up has a larger time unit.

I = ψ. In the Schrödinger and the Dirac equation, it is compared to another, external time t: ∂ψ ⁄ ∂t. Time is information.

The physical world is imprecise and finite. How to describe finite systems with our infinite variables of the mind? This is done by convolution: ∫ψ*ψ dx dt = ⟨ψ|ψ⟩.

At any x, t our mind has finite precision. And at any x, t of our mind, the physical world also has imprecision of its own. ψ summarizes both imprecisions. ψ counts how many alternative chance states there are for a given x, t.

That a complex probability amplitude is used for ψ allows mapping whatever physical variables there are onto two cycling meta-variables. Multiplying (convolving) with the conjugate, ⟨ψ|ψ⟩, finally projects the cycle onto the direction of the observer.

The evolution in time t of I = ψ is the energy W (Schrödinger equation):

iħ ∂ψ ⁄ ∂t = Eψ

i because of the differentiation and ħ = h ⁄ 2π due to hν = ħω.
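
A minimal symbolic check of this form of the equation, assuming ψ is a pure oscillating phase (a sketch, not a general solution):

import sympy as sp
t, E, hbar = sp.symbols('t E hbar', positive=True)
psi = sp.exp(-sp.I*E*t/hbar)      # assumed: a pure phase cycling in time
lhs = sp.I*hbar*sp.diff(psi, t)   # i*hbar * dpsi/dt
print(sp.simplify(lhs - E*psi))   # 0: the phase satisfies i*hbar*dpsi/dt = E*psi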

Since I = ψ implies time, the left side is time by time. The right side is what components constitutes one time step.

In the Dirac equation the ψ has four components corresponding to the same time.

The cycling produces spin. For a photon it is the cycling between electric field E and magnetic field B and it can be mapped to xi, t ( = xμ) via the Maxwell equations. The correspondence of one E, B-cycle to one xμ-rotation makes the photon a spin 1 particle.

If more variables are involved one cycling corresponds to more or less physical space rotations. For the Dirac ψ, one space rotation is only half of the cycle: Fermions have spin 1 ⁄ 2.

Lagrangian and Hamiltonian

The information I of a variable is its number of values. The values constitute time steps. The components of all the values form a curved space that allows the variable to cycle. Comparison to an external time, W = ∂I ⁄ ∂t, does not change I. W is a constant of motion.

Energy by itself is kinetic (K), because it is about time steps, i.e. about changes, but one can usually not consider all parts. Therefore one summarizes the remaining parts in potential energy (P) and associates it with the location of the observed part.

The Lagrangian L looks at a possibly small part of the system and measures the information flow per time from the potential part to the kinetic (i.e. observed) part.

L(x, ẋ) = K − P = pẋ − H(x, p(ẋ))

pẋ − H(x, p(ẋ)) is a Legendre transformation.

pẋ = mẋ² is the full energy. Splitting off the non-observable part of the system half-and-half makes K(ẋ) = mẋ² ⁄ 2.

L constitutes a time step in the interaction between the two systems, while H represents the information itself. H =  − W = I ⁄ t.

H(x, ẋ) = K + P

pẋ in L = pẋ − H varies over time. So L(x, ẋ) oscillates around 0. L expresses the phase of the observed components. The sum over a cycle becomes minimal, because the information exchange cancels over one cycle.
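
A numeric sketch of this behaviour for an assumed harmonic oscillator cycle: H = K + P stays constant, while L = pẋ − H oscillates around 0 and averages to (numerically) 0 over one cycle.

import numpy as np
m, k, A = 1.0, 1.0, 1.0                  # assumed mass, spring constant, amplitude
w = np.sqrt(k/m)                         # cycle frequency
t = np.linspace(0.0, 2*np.pi/w, 10001)   # one full cycle
x = A*np.cos(w*t)                        # position observable
v = -A*w*np.sin(w*t)                     # velocity observable
H = m*v**2/2 + k*x**2/2                  # K + P: constant over the cycle
L = m*v**2 - H                           # p*xdot - H: oscillates around 0
print(H.max() - H.min(), L.mean())       # both ~0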

Calculus of variations produces a condition that needs to be satisfied to have stationary information, i.e. constant information.

W = −H stays constant. ∫W dt would count system time to infinity. L(x, ẋ) = mẋ² + W = mẋ² − H oscillates and returns to the same value in a cycle. J = ∫L dt returns to the same value after one or many cycles. This is bounded and can be minimized.

With a stationary J = ∫L dt one gets the Euler-Lagrange equation or the Hamilton equations (further down). This corresponds to demanding that H = −W is constant, as was done above (Newton).

In general, equations of motion (eom) produce the proper time and information I of the system (on-shell). Else we would count more than what actually constitutes one time step (off-shell).

δJ = ∫(δx ∂L ⁄ ∂x + δẋ ∂L ⁄ ∂ẋ) dt   [partial integration of the second part]
   = ∫ δx (∂L ⁄ ∂x − (d ⁄ dt)(∂L ⁄ ∂ẋ)) dt
δJ ⁄ δx = ∂L ⁄ ∂x − (d ⁄ dt)(∂L ⁄ ∂ẋ) = 0

Stationary condition δJ ⁄ δx = 0:

∂L ⁄ ∂x − (d ⁄ dt)(∂L ⁄ ∂ẋ) = 0

By replacing ∂ẋL = p and F = ∂xL, this is Newton's F = ṗ = ma. Note, F = ∂L ⁄ ∂x and p = ∂L ⁄ ∂ẋ = ∂τ ⁄ ∂x are by definition.

  • We need to add a physical p and F separately to find L (Newton approach): e.g. p = mẋ or F = GmM ⁄ r²
  • Or we need to add a physical L to get p and F (Lagrange approach): e.g. L = mẋ² ⁄ 2 − GmM ⁄ r = mṙ² ⁄ 2 + mr²φ̇² ⁄ 2 − GmM ⁄ r.
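
As a sketch of the Lagrange approach, SymPy can carry out the Euler-Lagrange step symbolically; the one-dimensional L = mẋ² ⁄ 2 − mgx used here is an assumed example, not the Kepler Lagrangian above:

import sympy as sp
t = sp.symbols('t')
m, g = sp.symbols('m g', positive=True)
x = sp.Function('x')(t)
xdot = sp.diff(x, t)
L = m*xdot**2/2 - m*g*x                             # assumed example Lagrangian K - P
eom = sp.diff(sp.diff(L, xdot), t) - sp.diff(L, x)  # d/dt(dL/dxdot) - dL/dx
print(sp.Eq(eom, 0))                                # m*x''(t) + m*g = 0, i.e. F = ma with F = -m*g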

One cannot derive Newton's laws from the minimization of the action J. One cannot derive physics. One needs to observe.

L(x, ) is transformed to H(p, x) via a Legendre transformation. H(p, x) considers the system as a whole, rather than the inflow or outflow of information from a component, as with L.

dL ⁄ dt = (d ⁄ dt)(ẋ ∂L ⁄ ∂ẋ) + ∂L ⁄ ∂t
0 = (d ⁄ dt)(ẋ ∂L ⁄ ∂ẋ − L) + ∂L ⁄ ∂t
H(x, p) = pẋ − L(x, ẋ(p, x))

∂I ⁄ ∂t = −H is the Hamilton-Jacobi equation. Information I is Hamilton's principal function.

Interactions have a constant rate unless the exchanged energy packets become of higher value. The cycling values of a variable from this level form the energy packet of the next higher level.

I = ∫dI = ∫W dt is the full count of values, i.e. the full information of the system. W = −H = ∂I ⁄ ∂t compares the system time steps dI to some other system's time t.

The Euler-Lagrange equation becomes the Hamilton equations.

∂H ⁄ ∂x = −dp ⁄ dt,  ∂H ⁄ ∂p = dx ⁄ dt

F = −∂H ⁄ ∂x is the reason for ∫F ds = W.

Without the external dt in the Hamilton equations, we have:

∂I ⁄ ∂x = Δp,  ∂I ⁄ ∂p = Δx

Or, integrating either of the two:

ΔI = ΔpΔx

Each dI change is represented by a phase space volume element ΔpΔx.

The information resolution of the physical world has a lower limit h.

ΔI = ΔpΔx ≥ h
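
A small numeric sketch of this resolution limit; the confinement length Δx is an assumed example:

import scipy.constants as sc
dx = 1e-10            # assumed: confine an electron to about one atomic diameter, in m
dp = sc.h/dx          # smallest momentum step allowed by dp*dx >= h, ~6.6e-24 kg*m/s
dv = dp/sc.m_e        # corresponding velocity step, ~7e6 m/s
print(dp, dv)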

Quantum mechanics realized that the ΔpΔx step is the time step: ∂ψ ⁄ ∂t = ∂²ψ ⁄ (∂p∂x). On the left side we have t where on the right side we have px. The ψ is our information I.

The Schrödinger equation, and thus ΔI = ΔpΔx, is just an example. The Dirac equation has more observables falling into the same system time step.

The constant I is the information of the observed system. I represents a variable, which cycles, forming the geodesic of the curved component space of the system.

In the higher level I may be just one value, one time step, which by itself has no information and cannot be described.

Taking away values from the trajectory, e.g. reducing the radius in the hydrogen atom's electron orbital, creates a separate variable (a photon), i.e. a separate information packet to keep the total information constant.

Values do not Commute

dI is a time step and H compares it to some external time step dt. H(x, p) is the same for every time step, i.e. every component combination or point in the phase space. H corresponds to the h in the lowest level.

H dt = dI = (∂I ⁄ ∂p)dp + (∂I ⁄ ∂x)dx = x dp − p dx ≥ h
ΔxΔp ≥ ħ ⁄ 2

So the values of a variable do not commute, because they are one time, one causal chain, serial.

The values of different variables do commute, because they are independent, parallel, without correlation. If one would sum over some external time stretch T and divide by T, one would get 0: (1 ⁄ T)∫(dτ ⁄ dt)dt → 0. dτ would count any possible combined change of independent variables. dt would be of arbitrary size and would make T = ∫dt arbitrarily large, since there is no cycling.

Topology

A variable implies time, which implies processing. Two variables have two times, i.e. parallel processing. A variable by itself is sequential, i.e. causal, meaning the values of a variable form a sequence.

Serialization of variables makes the variable into a value of a higher level variable.

Higher level variables are a mix of serial and parallel processing of lower level variables of varying size. The more or less independent times of the variables, i.e. the information encapsulations, account for all the topologies of our universe.

All the topology is constructed by parallel vs serial in levels starting from the elementary h.

Since information is time already, the universe evolves via information alone.

The timing of a higher level variable is the result of the topology of variables it builds upon. On the lowest level the rate is constant and given by h. All serial interactions summed over the layers cannot exceed h.

Fast higher level interactions slow down the lower level interactions. For example,

  • in high gravity lower level clocks tick slower or
  • if S changes fast in a thermodynamic ST we cannot reach equilibrium, which keeps T based exchange slow
  • if humans interact a lot the thinking in the mind becomes slower due to the distraction

W has an intrinsic uncertainty that defines a level.

With fixed W, large variables (with many values) cycle slowly. A higher level variable can become faster by making the lower variables of smaller size (W² = p²c² + m²c⁴) or parallel.

Within a level the interactions (W) are highest if lower variables are of same W, i.e. synchronized.

If interacting parallel variables do not cycle with the same W, there is a distribution of information until equilibrium. The distribution of information is also called entropy maximization.

More parallel variables increase the information throughput. Energy in higher levels thus compares the degree of parallelization. This is a generalization of the thermodynamic W = ST.

The inertia (mass) of a larger system is due to the time needed to change or synchronize lower variables. It takes information flow and that takes time.

Quantum Units

For W = I ⁄ t, all the variables that can work as external t are information, too. The lowest quantum h is therefore also a time quantum.

Δtmin = h

c = dx ⁄ dt with constant c and minimum time h makes the minimum space quantum

xmin = ch

Energy compares two times and its minimum is thus W = h ⁄ h = 1 in the lowest level. mc² = hν = 1 produces m = 1 ⁄ c². Setting the maximum coordinate speed to c = 1 makes the minimum mass m = 1.

Emin = 1 mmin = 1

Numerical values:

import scipy.constants as sc
c = sc.c    # speed of light, 299792458.0 m/s
h = sc.h    # Planck constant, 6.62607015e-34 J*s
t = h       # minimum time quantum, as set above
x = c*h     # minimum space quantum c*h, 1.9864458571489286e-25

It makes sense to set h = 1 and c = 1:

Then the minimal values are

c = 1, h = 1, t = 1, x = ch = 1, W = m = 1

One can continue like that for other quantities, like electrodynamic E and B.