Stardrive

The whole point of Einstein's equivalence principle (in all its variations) is that gravity is not a force in the same way that electromagnetism or the weak and strong interactions are forces. The fundamental idea is that we are weightless in free-fall local inertial frames (on timelike geodesics in curved spacetime, without rotation about our centers of mass). Therefore the force of gravity, in the sense of Newton's 17th-century theory, is locally eliminated, and whenever we feel weight we are in a locally accelerating non-inertial frame in curved spacetime. We need to be objectively accelerated by some non-gravity force in order to stand still in curved spacetime, as when standing "still" on a scale on the surface of the Earth. It is the electrical force that provides the acceleration we feel as weight and that moves the pointer on the scale. This is very counter-intuitive, and its implication is lost even on PhD high-energy physicists and string theorists.

For example, in a textbook you will see the classical static spherically symmetric vacuum metric tensor field

gtt = (1 - rs/r)

grr = - (1 - rs/r)^-1

etc.

rs = 2GM/c^2 ~ total energy causing the 4D spacetime curvature

rs/r < 1

Those tensor components are only a representation relative to a privileged class of locally non-inertial, accelerating "static" observers whom some non-gravity force keeps at fixed r.
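To make the role of the non-gravity force concrete, here is a minimal Python sketch (standard textbook result; the Earth numbers are illustrative choices of mine) of the proper acceleration that must be supplied to hold a "static" observer at fixed r in the metric above. It diverges as r -> rs:

    import math

    G = 6.674e-11   # Newton's constant, m^3 kg^-1 s^-2
    C = 2.998e8     # speed of light, m/s

    def static_proper_acceleration(mass_kg: float, r_m: float) -> float:
        """Proper acceleration (m/s^2) a hovering observer at radius r must feel."""
        rs = 2 * G * mass_kg / C**2          # Schwarzschild radius
        newtonian = G * mass_kg / r_m**2     # Newtonian g
        return newtonian / math.sqrt(1 - rs / r_m)

    # On Earth's surface this reduces to the familiar ~9.8 m/s^2,
    # supplied electromagnetically by the ground pushing up on our feet.
    print(static_proper_acceleration(5.972e24, 6.371e6))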

By the way, r is essentially the square root of the Hawking-Bekenstein-'t Hooft-Susskind maximal thermodynamic hologram entropy/information of the 3D region that a closed 2D surface at fixed r surrounds. Any attempt to squeeze more information into that volume will cause a black hole event horizon to develop at that r. The area of the surrounding surface is always the Euclidean 4πr^2.
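The entropy claim is easy to put in numbers; a minimal sketch using the standard Bekenstein-Hawking formula S = kB A / (4 lP^2) with A = 4πr^2 (the 1 cm example radius is my own):

    import math

    G = 6.674e-11          # m^3 kg^-1 s^-2
    HBAR = 1.055e-34       # J s
    C = 2.998e8            # m/s
    L_P2 = HBAR * G / C**3 # Planck length squared, m^2

    def max_entropy_bits(r_m: float) -> float:
        """Holographic bound on the information inside radius r, in bits."""
        area = 4 * math.pi * r_m**2            # Euclidean area of the 2D surface
        return area / (4 * L_P2) / math.log(2) # S/kB, converted from nats to bits

    # Example: a 1 cm sphere already bounds ~10^66 bits.
    print(f"{max_entropy_bits(0.01):.2e} bits")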

String theorists like to invoke six or seven extra space dimensions, but there is no real evidence for them, nor for supersymmetry between fermions and bosons in 3D space. Perhaps the LHC will provide such evidence? The idea is that timelike geodesics in the larger spacetime unify all the forces. This is a formal extension of Einstein's equivalence principle that would provide a consistent conceptual unification in the sense of the old Kaluza-Klein idea.

The physics of the 2D event horizon hologram screens, both for black holes and for the observer-dependent dark energy cosmology, is anyonic, with fractional quantum statistics. Somehow these surface anyons make 3D hologram images with a supersymmetry that is badly broken in our world. No one has shown how this works in detail as yet.

Short excerpts under fair use for non-commercial educational purposes designed to stimulate the serious technology student to study the complete article.

"much of our discussion has been directly relevant to the measurement of mechanical nanoresonators, a topic attracting considerable recent attention. These nanoresonators are typically studied by coupling them either to electrical often superconducting
circuits or to optical cavities. A key goal is to achieve quantum-limited continuous position detection set by the quantum limit is in principle independent from being at low temperatures, it becomes interesting only when the systems are near their ground state; one could then, e.g., monitor the oscillator’s zero-point fluctuations ... Other important
directions in nanomechanics include the possibility of detecting quantum jumps in the state of a mechanical resonator via QND measurement of its energy ... Back-action evasion using a
microwave cavity detector coupled to a nanomechanical resonator was recently reported Hertzberg et al., 2010. Another area distinct from nanomechanics where rapid progress is being made is the readout of solid state qubits using microwave signals sent through cavities
whose transmission properties are controlled by the qubit. At the moment, one is close to achieving good fidelity single-shot QND readout, which is a prerequisite for a large number of applications in quantum information processing. The gradually growing information
about the qubit state is extracted from the measured noisy microwave signal trace, leading to a corresponding collapse of the qubit state. This process can also be described by conditional quantum evolution and quantum trajectories.

A promising method for superconducting qubit readout currently employed is a so-called latching measurement, where the hysteretic behavior of a strongly driven anharmonic system (e.g., a Josephson junction) is exploited to toggle between two states depending on the qubit state (Siddiqi et al., 2004; Lupaşcu et al., 2006). Although this is then no longer a linear measurement scheme and is therefore distinct from what was discussed in this review, it can be turned into a linear amplifier for a sufficiently weak input signal. An interesting and important open question is whether such a setup can reach the quantum limit on linear amplification.

Both qubit detection and mechanical measurements in electrical circuits would benefit from quantum-limited on-chip amplifiers. Such amplifiers are now being developed using the tools of circuit quantum electrodynamics, employing Josephson junctions or SQUIDs coupled to microwave transmission line cavities (Bergeal et al., 2008; Castellanos-Beltran et al., 2008). Such an amplifier has already been used to perform continuous position detection with a measurement imprecision below the SQL level (Teufel et al., 2009)."

Introduction to quantum noise, measurement, and amplification

A. A. Clerk*
Department of Physics, McGill University, 3600 rue University Montréal, Quebec, Canada
H3A 2T8
M. H. Devoret
Department of Applied Physics, Yale University, P.O. Box 208284, New Haven,
Connecticut 06520-8284, USA
S. M. Girvin
Department of Physics, Yale University, P.O. Box 208120, New Haven,
Connecticut 06520-8120, USA
Florian Marquardt
Department of Physics, Center for NanoScience, and Arnold Sommerfeld Center for
Theoretical Physics, Ludwig-Maximilians-Universität München, Theresienstrasse 37,
D-80333 München, Germany
R. J. Schoelkopf
Department of Applied Physics, Yale University, P.O. Box 208284, New Haven,
Connecticut 06520-8284, USA
Published 15 April 2010
"The topic of quantum noise has become extremely timely due to the rise of quantum information physics and the resulting interchange of ideas between the condensed matter and atomic, molecular, optical–quantum optics communities. This review gives a pedagogical introduction to the physics of quantum noise and its connections to quantum measurement and quantum amplification. After introducing quantum noise spectra and methods for their detection, the basics of weak continuous measurements are described. Particular attention is given to the treatment of the standard quantum limit on linear amplifiers and position detectors within a general linear-response framework. This
approach is shown how it relates to the standard Haus-Caves quantum limit for a bosonic amplifier known in quantum optics and its application to the case of electrical circuits is illustrated, including mesoscopic detectors and resonant cavity detectors.
DOI: 10.1103/RevModPhys.82.1155 PACS numbers: 72.70.+m
CONTENTS
I. Introduction 1156
II. Quantum Noise Spectra 1159
A. Introduction to quantum noise 1159
B. Quantum spectrum analyzers 1161
III. Quantum Measurements 1162
A. Weak continuous measurements 1164
B. Measurement with a parametrically coupled resonant cavity 1164
1. QND measurement of the state of a qubit using a resonant cavity 1167
2. Quantum limit relation for QND qubit state detection 1168
3. Measurement of oscillator position using a resonant cavity 1169
IV. General Linear-Response Theory 1173
A. Quantum constraints on noise 1173
1. Heuristic weak-measurement noise constraints 1173
2. Generic linear-response detector 1174
3. Quantum constraint on noise 1175
4. Evading the detector quantum noise inequality 1176
B. Quantum limit on QND detection of a qubit 1177
V. Quantum Limit on Linear Amplifiers and Position Detectors 1178
A. Preliminaries on amplification 1178
B. Standard Haus-Caves derivation of the quantum limit on a bosonic amplifier 1179
C. Nondegenerate parametric amplifier 1181
1. Gain and added noise 1181
2. Bandwidth-gain trade-off 1182
3. Effective temperature 1182
D. Scattering versus op-amp modes of operation 1183
E. Linear-response description of a position detector 1185
1. Detector back-action 1185
2. Total output noise 1186
3. Detector power gain 1186
4. Simplifications for a quantum-ideal detector 1188

*clerk@physics.mcgill.ca

"having a near-quantum-limited detector would allow one to continuously monitor the
quantum zero-point fluctuations of a mechanical resonator. It is also necessary to have a quantum-limited detector is for such tasks as single-spin NMR detection.... as well as gravitational wave detection ... Particular attention is given to the quantum mechanics of transmission lines and driven electromagnetic cavities, topics that are especially relevant given recent experiments making use of microwave stripline resonators. ...

TABLE II. Contents of online appendix material. Page numbers refer to the supplementary material.

Section Page
A. Basics of classical and quantum noise 1
B. Quantum spectrum analyzers: further details 4
C. Modes, transmission lines and classical input-output theory 8
D. Quantum modes and noise of a transmission line 15
E. Back-action and input-output theory for driven damped cavities 18
F. Information theory and measurement rate 29
G. Number phase uncertainty 30
H. Using feedback to reach the quantum limit 31
I. Additional technical details 34
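The Haus-Caves quantum limit mentioned in the abstract above can be stated compactly. A minimal sketch (standard result, my code, not from the review): a phase-preserving linear amplifier with photon-number gain G must add at least |1 - 1/G|/2 quanta of noise referred to its input:

    def haus_caves_min_added_quanta(gain: float) -> float:
        """Lower bound on input-referred added noise, in quanta."""
        return abs(1.0 - 1.0 / gain) / 2.0

    for gain in (2.0, 10.0, 100.0):
        print(gain, haus_caves_min_added_quanta(gain))
    # Approaches the famous half quantum of added noise as G -> infinity.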

Of course one can learn a lot from Feynman's diagram approach, starting with a small spin-2 symmetric tensor field on a non-dynamical global Minkowski background. As in the BCS theory of superconductivity, Feynman has to sum an infinite series of a restricted class of diagrams to get a non-perturbative QFT version of Einstein's 1916 classical field equations

Guv(curved spacetime) + (G-string tension)^-1 Tuv(all non-gravity fields) = 0

where we now know, from the Type Ia supernovae data among other observations, that the hologram idea of 't Hooft & Susskind seems to work! i.e.,

dark energy density in our past light cone ~ (area of our subjective future event horizon in our future light cone)^-1

in Igor Novikov's globally self-consistent loop in time, needing the Wheeler-Feynman principle with our future horizon as the total absorber, giving net retarded causation from the John Cramer transactions (see also Mike Ibison's papers).

Instead, the spin-1 gravity tetrad fields are emergent macro-quantum coherent order parameters from the post-inflation vacuum superconductor, possibly deeply connected to the 8 gluon Goldstone phases conjugate to the 8 color charges of SU(3) QCD. The tetrads are EPR-entangled states of pairs of Penrose-Rindler qubits in the pre-geometry (J. A. Wheeler). Einstein's spin-2 gravity field used by Feynman in his QFT approach is a composite of pairs of tetrads, but in QM

1 + 1 = 2 + 1 + 0

so, as with entangled Cooper pairs in a BCS superconductor, there are states beyond the S-wave relative orbital state.
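In representation-theory terms the shorthand above is just the standard Clebsch-Gordan decomposition of two spin-1 factors, with the dimension count checking out:

    % Two spin-1 tetrad factors decompose as
    %   1 (x) 1 = 2 (+) 1 (+) 0
    \mathbf{1} \otimes \mathbf{1} = \mathbf{2} \oplus \mathbf{1} \oplus \mathbf{0},
    \qquad 3 \times 3 = 5 + 3 + 1 = 9 .

The spin-2 piece is the usual graviton; the spin-1 and spin-0 pieces are the extra states the Cooper-pair analogy points to.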

From: cadmo...
Date: May 27, 2010 5:46:54 PM PDT
To: adastra1@me.com
Subject: Fwd: [Sarfatti_Physics_Seminars] Are string theorists confused about gravity as a force to be unified with the others?

"Dr. Sarfatti,
 
Muito obrigado [thank you very much]! Great abstract. I lol'd when I read your Subject heading."
 
Jonah L...

May 26

Curvature, Torsion and the Qubit Spinor Pre-Geometry

 

Commentary on the Penrose-Rindler Formalism

 

Jack Sarfatti

 

Abstract

 

The physics of spacetime and matter fields is unified by the principle of local gauge invariance applied to symmetry groups of different kinds of frame transformations that leave the global dynamical classical actions invariant. Indeed, the geometrodynamic field embodied in the spin-1 gravity tetrads can be looked at as simply another dynamical field on a formal global Minkowski spacetime that is not generally directly observable, since the behavior of clocks and measuring rods is controlled by the geometrodynamical field in a universal way consistent with the strongest form of Einstein’s equivalence principle. The most fundamental quantity here is the connection field, which is always a non-tensor field relative to the global symmetry group it localizes. Einstein’s 1916 General Relativity (GR) is, from this point of view, simply the localization of the universal globally rigid abelian 4-parameter translation Lie group T4, whose Lie algebra is the total energy-momentum 4-vector of the matter fields on the globally flat Minkowski spacetime of Einstein’s 1905 Special Relativity (SR). The electromagnetic-weak and strong sub-nuclear forces are essentially the connections for parallel transport of tensors and spinors in the internal fiber spaces of the U1, SU2, & SU3 groups, with projections onto world lines (in the point particle low energy limit) in 4D spacetime. Unlike T4, the groups U1, SU2, SU3 are not universal. The connections for the internal symmetry groups are essentially the gauge potentials, with their covariant “curvature” curls as the “forces”. The situation is qualitatively different in the case of Einstein’s 1916 GR, where the T4-based torsionless Levi-Civita (Christoffel symbol) connection is Newton’s “gravity force” that is locally equivalent to the inertial g-force of an accelerating detector (aka LNIF). For example, we, standing still on the surface of the Earth, must accelerate in order not to move in curved spacetime. The inability of even some PhD physicists to really understand this has led to a lot of confusion, especially among naïve high-energy particle physicists attempting to unify the “four forces”, as well as some relativists and philosophers of physics who try to argue that Einstein was wrong in the way he formulated the equivalence principle.



 

May 23

Holographic Conjecture


"The holographic principle—dating back to ’t Hooft

1985 and Susskind 1995—goes even further, and suggests

that generally all information that is contained in a

volume of space can be represented by information that

resides on the boundary of that region. For an extensive

review, see Bousso 2002.the holographic principle Bousso, 2002

—the conjecture that

the information contained in a volume of space can

be represented by a theory which lives in the boundary

of that region—could be related to the area law

behavior of the entanglement entropy in microscopic

theories. ... 

Area laws also say something on

how quantum correlations are distributed in ground

states of local quantum many-body systems. Interactions

in quantum many-body systems are typically

local, which means that systems interact only over a

short distance with a finite number of neighbors. The

emergence of an area law then provides support for

the intuition that short ranged interactions require

that quantum correlations between a distinguished

region and its exterior are established via its boundary

surface. That a strict area law emerges is by no

means obvious from the decay of two-point correlators,

as we will see. Quantum phase transitions are

governed by quantum fluctuations at zero temperature,

so it is more than plausible to observe signatures

of criticality on the level of entanglement and

quantum correlations. This situation is now particularly

clear in one-dimensional 1D systems ...

It is hence not the decay behavior

of correlation functions as such that matters here,

but in fact the scaling of entanglement.

• Topological entanglement entropy: The topological

entanglement entropy is an indicator of topological

order a new kind of order in quantum many-body

systems that cannot be described by local order parameters

... Here a global feature is detected by

means of the scaling of geometric entropies.

...

In critical models the correlation length diverges and

the models become scale invariant and allow for a description

in terms of conformal field theories. According

to the universality hypothesis, the microscopic details

become irrelevant for a number of key properties. These

universal quantities then depend only on basic properties

such as the symmetry of the system, or the spatial

dimension. Models from the same universality class are

characterized by the same fixed-point Hamiltonian under

renormalization transformations, which is invariant

under general rotations. Conformal field theory then describes

such continuum models, which have the symmetry

of the conformal group including translations, rotations,

and scalings. The universality class is

characterized by the central charge c, a quantity that

roughly quantifies the “degrees of freedom of the

theory.” For free bosons c=1, whereas the Ising universality

class has c=1/2.

Once a model is known to be described by a conformal

field theory, powerful methods are available to compute

universal properties, and entanglement entropies

or even the full reduced spectra of subsystems. ...

On both sides of a

critical point in a system undergoing a quantum phase

transition, the quantum many-body system may have a

different kind of quantum order; but this order is not

necessarily one that is characterized by a local order parameter:

In systems of, say, two spatial dimensions, topological

order may occur. Topological order manifests

itself in a degeneracy of the ground-state manifold that

depends on the topology of the entire system and the

quasiparticle excitations then show an exotic type of

anyonic quasiparticle statistics. These are features that

make topologically ordered systems interesting for

quantum computation, when exactly this degeneracy can

be exploited in order to achieve a quantum memory robust

against local fluctuations. They even allow in theory

for robust instances of quantum computation, then referred

to as topological quantum computation"


 So what is the actual technical problem here?

"The Question is: What is The Question?" John A. Wheeler
All gravity fields are approximately uniform in a small region of 4D spacetime, by Taylor's theorem of calculus.
This is all that is needed in Einstein's 1916 GR.
We never need to invoke a global static uniform gravity field.
Is such a field even possible, one might ask?
For static LNIFs, Newton's idea of a global static gravity field is deconstructed as a possible Einstein metric field, with observer-dependent representation, for the vacuum outside the source Tuv:
ds^2(static LNIF) = (1 + VNewton/c^2)(cdt)^2 - (1 + VNewton/c^2)^-1 dz^2 - dx^2 - dy^2
where (note no factor of 2 in the above model, unlike the central force 1/r potential problem)
VNewton = gz
The g-force is then
-g = -dVNewton/dz
directed back toward z = 0.
There is no event horizon in this "dark matter" model and, of course, the g-force is independent of the rest mass of the test particles.
If the g-force is repulsive away from z = 0, then there is an event horizon for a "dark energy" slab vacuum domain wall! (Rindler?)
The source must be something like an infinite uniform density mass plane at z = 0 in the x-y plane (analogous to the electrical capacitor problem).
The problem is whether the above intuitive guess at a solution is what one gets from Einstein's field equation
Guv + kTuv = 0
where Tuv corresponds to a Dirac delta function δ(z) uniform density.
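Part of this guess can be checked directly. A minimal sympy sketch (my construction; the metric function f(z) = 1 + gz/c^2 is read off the line element above) computes the full Riemann tensor away from z = 0. Every component vanishes when f is linear in z, so the exterior is Riemann-flat vacuum, a Rindler wedge in disguise, consistent with the Vilenkin domain-wall remarks in the email below:

    import sympy as sp

    t, z, x, y, g, c = sp.symbols('t z x y g c', positive=True)
    f = 1 + g*z/c**2
    coords = [t, z, x, y]
    gmat = sp.diag(f*c**2, -1/f, -1, -1)  # ds^2 = f (c dt)^2 - dz^2/f - dx^2 - dy^2
    ginv = gmat.inv()

    # Christoffel symbols Gamma^a_{be}
    Gamma = [[[sum(ginv[a, d]*(sp.diff(gmat[d, b], coords[e])
                               + sp.diff(gmat[d, e], coords[b])
                               - sp.diff(gmat[b, e], coords[d]))/2
                   for d in range(4)) for e in range(4)] for b in range(4)]
             for a in range(4)]

    # Riemann tensor R^a_{b c_ d}
    def riemann(a, b, c_, d):
        expr = sp.diff(Gamma[a][b][d], coords[c_]) - sp.diff(Gamma[a][b][c_], coords[d])
        expr += sum(Gamma[a][e][c_]*Gamma[e][b][d] - Gamma[a][e][d]*Gamma[e][b][c_]
                    for e in range(4))
        return sp.simplify(expr)

    print(all(riemann(a, b, c_, d) == 0 for a in range(4) for b in range(4)
              for c_ in range(4) for d in range(4)))  # True: flat away from the wall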
On Mar 31, 2010, at 11:20 PM, Paul Zielinski wrote:

Jack, I believe you've just scored yet another of your world famous "own goals" here.

When he refers to a "homogeneous" gravitational field, Einstein is not talking about a uniform frame acceleration field.
He is talking about an *actual* gravity field of uniform field strength. 

There is no problem with defining such a field operationally, since test object acceleration can always be measured at 
every point at *zero test object velocity*, eliminating any SR-related effects. 

So what Einstein has in mind here when he uses the term "homogeneous to first order" is the non-vanishing curvature 
associated with typical gravity fields produced by matter.

Now it is nevertheless true that a Rindler frame (relativistic accelerating frame of reference) does exhibit such SR-type 
effects -- but this is just another argument against Einstein's proposed principle, since it ensures that the phenomena observed 
in such a frame differ from those observed from a non-accelerating frame even in the presence of a perfectly homogeneous 
gravity field (Einstein's best case).

So yes Einstein was later forced to retreat even from this version of the principle, even given his best case of a perfectly 
homogeneous ("certain kind of") gravity field compared with a uniform acceleration field, eventually restricting the principle to what 
we like to call "local" observations, irrespective of the question of spacetime curvature.

You don't seem to realize that this is an argument *against* Einstein's original concept of equivalence, not for it.

In any case, even if one is restricted to pure local observations, the principle as stated still does not work. Why? Because you cannot recover non-tidal gravitational acceleration -- a locally observable phenomenon -- from *any* kind of frame acceleration, either globally or locally!

You can always bring a test object as close as you like to a source boundary, and locally measure its acceleration with respect to the source. Such locally observable gravitational acceleration will not be observed in *any* kind of frame acceleration field. Which means that Einstein's proposed principle as stated is simply false: the laws observed even in a perfectly homogeneous gravity field are not the same as those observed in a uniform frame acceleration field -- not even approximately.

Vilenkin's vacuum domain wall solutions, in which the vacuum geometry is completely Riemann flat,  show that this kind of situation 
does exist in 1916 GR. A test object released near such a gravitational source will experience locally observable gravitational 
acceleration with respect to the source, which will not be observed in *any* pure frame acceleration field with the gravitational source 
switched off (by which I mean a Rindler frame in a Minkowski spacetime -- a pure frame acceleration field). 

So the only way to get Einstein's principle as stated to work is to ignore the phenomenon of gravitational acceleration. But what kind of a
"theory of gravity" can be based on such a principle?

My answer here is simple: Einstein's version of the equivalence principle is simply not supported by his 1916 theory of gravity. It is
simply a figment of Einstein's fevered imagination.

Which is what I've been saying all along.

Z.


On Wed, Mar 31, 2010 at 6:41 PM, JACK SARFATTI <sarfatti@pacbell.net> wrote:
As I have been trying to explain to Zielinski without success, such a global uniform gravity field does not exist, because of special relativity time dilation and length contraction - I mean it does not exist in the same sense that it would in Newton's gravity theory, using only the v/c ---> 0 Galilean group limit of the Lorentz subgroup of the Poincare group. Einstein was perfectly aware of this in the quote Zielinski cites - Zielinski simply does not understand Einstein's text, in my opinion.
On Mar 31, 2010, at 6:23 PM, Paul Murad wrote:

A "paradoxical" property

Note that Rindler observers with smaller constant x coordinate are accelerating harder to keep up! This may seem surprising, because in Newtonian physics observers who maintain constant relative distance must share the same acceleration. But in relativistic physics, we see that the trailing endpoint of a rod which is accelerated by some external force (parallel to its symmetry axis) must accelerate a bit harder than the leading endpoint, or else it must ultimately break. This is a manifestation of Lorentz contraction. As the rod accelerates, its velocity increases and its length decreases. Since it is getting shorter, the back end must accelerate harder than the front. This leads to a differential equation showing that, at some distance, the acceleration of the trailing end diverges, resulting in the Rindler horizon.
This phenomenon is the basis of a well-known "paradox". However, it is a simple consequence of relativistic kinematics. One way to see this is to observe that the magnitude of the acceleration vector is just the path curvature of the corresponding world line. But the world lines of our Rindler observers are the analogs of a family of concentric circles in the Euclidean plane, so we are simply dealing with the Lorentzian analog of a fact familiar to speed skaters: in a family of concentric circles, inner circles must bend faster (per unit arc length) than the outer ones.
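To put numbers on this, a minimal sketch of the standard Rindler relation a = c^2/x (the sample distances are my own):

    C = 2.998e8  # speed of light, m/s

    def rindler_proper_acceleration(x_m: float) -> float:
        """Proper acceleration (m/s^2) needed to hold Rindler coordinate x (m)."""
        return C**2 / x_m

    # An observer accelerating at ~1 g sits about one light-year from the
    # horizon at x = 0; trailing observers (smaller x) must accelerate harder.
    for x in (9.5e15, 4.75e15, 1.0e9):
        print(f"x = {x:.2e} m: a = {rindler_proper_acceleration(x):.2e} m/s^2")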
Paul:

Okay. I just want to make sure we are on the same sheet of music...
Ufoguy...

 

hologram dark energy density ~ (c^4/G)(1/asymptotic constant area of future horizon) ~ 10^-29 gm/cc
Our observer-dependent future horizon hologram (2D + 1) cosmic computer screen, which projects the interior (3D + 1) bulk matter fields as a retro-causal image, is effectively the total Wheeler-Feynman absorber, since all the interior bulk scatterings are merely hologram images of its information processing: IT FROM BIT.
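The order-of-magnitude claim is easy to check; a minimal dimensional-analysis sketch (my numbers; the hologram argument fixes only the order of magnitude, not the prefactor), taking a future event horizon radius of roughly 16 billion light-years:

    import math

    G = 6.674e-11   # m^3 kg^-1 s^-2
    C = 2.998e8     # m/s
    LY = 9.461e15   # meters per light-year

    R_horizon = 16e9 * LY                 # ~16 Gly asymptotic future horizon radius
    area = 4 * math.pi * R_horizon**2     # Euclidean area 4 pi r^2
    energy_density = (C**4 / G) / area    # J/m^3
    mass_density = energy_density / C**2  # kg/m^3

    print(f"{mass_density * 1e-3:.1e} g/cc")  # ~5e-30 g/cc, i.e. order 10^-29 g/cc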

On May 22, 2010, at 1:58 PM, michael ibison wrote:

"Dear Jack

Thank you for your positive (re-?) appraisal"
Jack Sarfatti comments: I did not listen closely to your short talk at the Retrocausal Workshop, AAAS USD, June 2006 was it? Also, at that time I did not know about the above picture in Tamara Davis's 2004 PhD thesis and did not connect the inverse area of the future horizon with the dark energy density. Indeed, I was not even aware of the future boundary then, thinking like Susskind et al. only of the past particle horizon. So I could not have made the connection back then - could not have connected the dots. It was only when Creon Levit showed me Tamara's thesis in Dec 2008, in the course of writing up the DICE 2008 paper, that the idea dawned on me. I think Hal Puthoff then reminded me of your work as a consequence? But it still did not sink in until Nick Herbert started objecting to my idea, along with some comments by James Woodward, who likes the idea because it is a way of formulating his Mach's Principle approach - indeed, I think Mach's Principle correctly formulated is a primitive form of the Hologram Conjecture of 't Hooft & Susskind, though they don't seem to see the essential role of Wheeler-Feynman retro-causation ensuring apparent net retarded causation as, e.g., Cramer explains in the transactional approach.
I independently thought of the idea you had already suggested: that the classical stretching of the de Broglie waves by the expansion of space has, via the Ehrenfest theorem, an interpretation as the statistical mean of a sequence of inelastic particle collisions with the geometrodynamic field (not seen in a static field). I picture that in terms of the spin-1 gravity tetrad fields (square roots of the historical Einstein spin-2 metric tensor field).

Continuation of Ibison's remarks:
"of my efforts to investigate the future conformal horizon. It is true that most but not all my effort was to cast the story in traditional RW coordinates, so the conformal horizon took a back seat in the AAAS paper. I was aware however of the importance of the latter, providing as it does, an alternative avenue for understanding the role of the evolution of the scale factor on the EM arrow of time. The conformal view was written up (though in a somewhat disguised way) in an arxiv posting some time ago: .

Even so, I have work to do before I can feel confident that this is all viable. At Vigier VII I will talk about the effect of the conformal boundary in more detail, giving a couple of options that formalize the boundary condition.

I know that you have for a while been talking about a Holographic Principle but had not given it the attention it probably deserved. As a result of my more recent efforts I do now see a role for that way of thinking. So good for you (if this turns out to be correct)."

Cheers,

Michael Ibison

 

May 22

Area = Entropy Hologram & Entanglement


 

REVIEWS OF MODERN PHYSICS, VOLUME 82, JANUARY–MARCH 2010

 

Area laws for the entanglement entropy

J. Eisert

Institute of Physics and Astronomy, University of Potsdam, 14469 Potsdam, Germany; Blackett Laboratory, Imperial College London, Prince Consort Road, London SW7 2BW, United Kingdom;and Institute for Mathematical Sciences, Imperial College London, Exhibition Road, London SW7 2PG, United Kingdom

M. Cramer and M. B. Plenio

Blackett Laboratory, Imperial College London, Prince Consort Road, London SW7 2BW, United Kingdom and Institut für Theoretische Physik, Albert-Einstein-Allee 11, Universität Ulm, D-89069 Ulm, Germany, Published 4 February 2010

"Physical interactions in quantum many-body systems are typically local: Individual constituents interact mainly with their few nearest neighbors. This locality of interactions is inherited by a decay of correlation functions, but also reflected by scaling laws of a quite profound quantity: the entanglement entropy of ground states. This entropy of the reduced state of a subregion often merely grows like the boundary area of the subregion, and not like its volume, in sharp contrast with an expected extensive behavior. Such “area laws” for the entanglement entropy and related quantities have received considerable attention in recent years. They emerge in several seemingly unrelated fields, in the context of black hole physics, quantum information science, and quantum many-body physics where they have important implications on the numerical simulation of lattice models. In this Colloquium the current status of area laws in these fields is reviewed. Center stage is taken by rigorous results on lattice models in one and higher spatial dimensions. The differences and similarities between bosonic and fermionic models are stressed, area laws are related to the velocity of information propagation in quantum lattice models, and disordered systems, nonequilibrium situations, and topological entanglement entropies are discussed. These questions are considered in classical and quantum systems, in their ground and thermal states, for a variety of correlation measures. A significant proportion is devoted to the clear and quantitative connection between the entanglement content of states and the possibility of their efficient numerical simulation. Matrix-product states, higher-dimensional analogs, and variational sets from entanglement renormalization are also discussed and the paper is concluded by highlighting the implications of area laws on quantifying the effective degrees of freedom that need to be considered in simulations of quantum states. ...

In classical physics concepts of entropy quantify the extent to which we are uncertain about the exact state of a physical system at hand or, in other words, the amount of information that is lacking to identify the microstate of a system from all possibilities compatible with the macrostate of the system. If we are not quite sure what microstate of a system to expect, notions of entropy will reflect this lack of knowledge. Randomness, after all, is always and necessarily related to ignorance about the state. ... In quantum mechanics positive entropies may arise even without an objective lack of information. ...

In contrast to thermal states this entropy does not originate from a lack of knowledge about the microstate of the system. Even at zero temperature we encounter a nonzero entropy. This entropy arises because of a fundamental property of quantum mechanics: entanglement. This quite intriguing trait of quantum mechanics gives rise to correlations even in situations where the randomness cannot be traced back to a mere lack of knowledge. The mentioned quantity, the entropy of a subregion, is called entanglement entropy or geometric entropy and, in quantum information, entropy of entanglement, which represents an operationally defined entanglement measure for pure states ...

one thinks less of detailed properties, but is rather interested in the scaling of the entanglement entropy when the distinguished region grows in size. In fact, for quantum chains this scaling of entanglement as genuine quantum correlations—a priori very different from the scaling of two-point correlation functions—reflects to a large extent the critical behavior of the quantum many-body system, and shares some relationship to conformal charges.

At first sight one might be tempted to think that the entropy of a distinguished region I will always possess an extensive character. Such a behavior is referred to as a volume scaling and is observed for thermal states. Intriguingly, for typical ground states, however, this is not at all what one encounters: Instead, one typically finds an area law, or an area law with a small often logarithmic correction: This means that if one distinguishes a region, the scaling of the entropy is merely linear in the boundary area of the region. The entanglement entropy is then said to fulfill an area law. It is the purpose of this Colloquium to review studies on area laws and the scaling of the entanglement entropy in a nontechnical manner."
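The c = 1 and c = 1/2 statement connects directly to the logarithmic corrections mentioned above. A minimal sketch of the standard Calabrese-Cardy formula S(L) = (c/3) ln(L/a) for a block of length L at a 1D critical point with periodic boundary conditions (the cutoff a and the sample lengths are my own choices):

    import math

    def critical_block_entropy(L: float, c: float, a: float = 1.0) -> float:
        """Leading entanglement entropy of a block of length L at criticality."""
        return (c / 3.0) * math.log(L / a)

    # Free boson (c = 1) vs Ising universality class (c = 1/2):
    for L in (10, 100, 1000):
        print(L, critical_block_entropy(L, 1.0), critical_block_entropy(L, 0.5))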


 

 

The problem of why the Pioneer Anomaly's anomalous acceleration is the same order of magnitude as the square root of the dark energy density, though in the wrong direction, has bugged me since 2002. Of course the dark energy and dark matter densities are of the same order of magnitude, .73 vs .23 of the critical density for the flat-space universe found in the inflation model. But why large-scale cosmology shows up on the small scale of our solar system is the problem, and why there is a hollow volume from the Sun out to the orbits of the outer planets is another mystery.
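For the record, the numerical coincidence goes like this; a minimal sketch (my numbers: Lambda from standard Lambda-CDM fits, the Pioneer value from Anderson et al.):

    import math

    C = 2.998e8           # m/s
    LAMBDA = 1.1e-52      # cosmological constant, 1/m^2
    A_PIONEER = 8.74e-10  # anomalous acceleration, m/s^2

    # Acceleration scale built from the dark energy (cosmological constant):
    a_lambda = C**2 * math.sqrt(LAMBDA)
    print(f"c^2 sqrt(Lambda) = {a_lambda:.2e} m/s^2 vs Pioneer {A_PIONEER:.2e} m/s^2")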

 

REVIEWS OF MODERN PHYSICS, VOLUME 82, JANUARY–MARCH 2010

 

 

Influence of global cosmological expansion on local dynamics and kinematics

Matteo Carrera*

Institute of Physics, University of Freiburg, Hermann-Herder-Straße 3, D-79104 Freiburg,

Germany

Domenico Giulini

Institute for Theoretical Physics, University of Hanover, Appelstraße 2, D-30167 Hannover,

Germany

Published 28 January 2010

"Attempts to estimate the influence of global cosmological expansion on local systems are reviewed. Here “local” is taken to mean that the sizes of the considered systems are much smaller than cosmologically relevant scales. For example, such influences can affect orbital motions as well as configurations of compact objects, like black holes. Also discussed are how measurements based on the exchange of electromagnetic signals of distances, velocities, etc. of moving objects are influenced. As an application, orders of magnitude of such effects are compared with the scale set by the apparently anomalous acceleration of the Pioneer 10 and 11 spacecrafts, which is 10^−9 m/s^2. There is no reason to believe that the latter is of cosmological origin. However, the general problem of gaining a qualitative and quantitative understanding of how the cosmological dynamics influences local systems remains challenging, with only partial clues being so far provided by exact solutions to the field equations of general relativity."

 

Thus our dark energy future event horizon, accelerating the expansion of the universe, is the Wheeler-Feynman total absorber, giving us Wheeler's IT FROM BIT with "retrocausality without retrocausality."

"In the direct action theory the EM fields of a single source are not exclusively retarded but are time-symmetric. The appearance of pure retardation is now explained as the result of interference by time-symmetric exchanges with the cosmological gravitational field. Just as the effect of a dielectric continuum can be regarded as the final result of a series of absorptions and re-emissions on the microscopic level [23], the macroscopic exchanges with the gravitational field implied by (34) can be interpreted likewise. If each exchange is subject to the constraint that it be time- symmetric, then the gravitational damping plays the same role as do the future absorbers in the Wheeler-Feynman theory. Anti-phase advanced waves from these exchanges arrive back at the current source to re-enforce the retarded component and cancel the advanced component. Consequently these proposed cosmological boundary conditions guarantee that every ‘photon’ (of which, strictly, there are now none) will be absorbed. The absorption is not by matter, but the whole system - which includes a term for the energy in the cosmological gravitational field. A ‘prediction’ of this implementation of the cosmological boundary condition is that, if the universe were not expanding, then there would be no apparent predominance of retarded radiation. Consequently the future state of the universe is felt in the present. If these arguments stand then the direct action theory is validated (and therefore preferred), and advanced potentials in the sense of (15) and (16) are ubiquitous."

The detection of dark energy shows that retarded light emitted toward the future is infinitely redshifted after traveling only a finite distance in our accelerating expanding space. Our future spacetime geometry is qualitatively different from our past inflation-to-hot-Big-Bang geometry; hence the asymmetric Arrow of Time, explaining why we age as the universe gets bigger.
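The "finite distance to infinite redshift" statement is the textbook de Sitter horizon. A minimal sketch (my numbers, assuming pure exponential expansion a(t) = e^{Ht}): light emitted now covers comoving distance (c/H)(1 - e^{-Ht}), which never exceeds c/H:

    import math

    H = 2.2e-18     # asymptotic Hubble rate, 1/s
    C = 2.998e8     # m/s
    GYR = 3.156e16  # seconds per gigayear

    def comoving_distance(t_s: float) -> float:
        """Comoving distance reached by time t by light emitted at t = 0."""
        return (C / H) * (1.0 - math.exp(-H * t_s))

    for t in (10 * GYR, 100 * GYR, 1000 * GYR):
        print(f"{t/GYR:6.0f} Gyr: {comoving_distance(t):.3e} m (limit {C/H:.3e} m)")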

Ibison's last section, below, is now obsolete, because in 2006 he was not aware of Tamara Davis's 2004 PhD thesis, which solves his problem here:

Reverse Causation
"A few words on the relevance of advanced potentials to the theme of this conference: The use of the phrase ‘reverse causation’ implies that one can meaningfully (i.e. semantically, if not in practice physically) separate the notion of logical casualty from temporal ordering. In order to do that, one must be able to identify a (more or less universal) property that distinguishes between a cause and an effect that is not the temporal order. Some arguments have been given here in support of the rehabilitation of the advanced potential. If one wished to identify all currents as causes and all potentials as effects, then absorption of radiation is an example of reverse causation. Since the most mathematically efficient description of absorption is through (exclusively) advanced potentials (Eq. (8) with Aout ), one may choose to associate reverse causation with the predominance of advanced potentials in an appropriately defined maximally efficient description. But no connection with the flow of entropy has been established in this document. As a result of considerations in the section ‘The Cosmological Boundary Condition’, it is not clear that entropy necessarily increases in Cosmological time, even in the event that retarded potentials turn out to be predominant in the ‘most efficient’ description of EM processes."

 

Michael Ibison
Institute for Advanced Studies at Austin 4030 West Braker Lane, suite 300, Austin, Texas 78759, USA.
Submitted for publication in Proceedings of AAAS Conference on Reverse Causation, 2006.
Abstract. "Advanced electromagnetic potentials are indigenous to the classical Maxwell theory. Generally however they are deemed undesirable and are forcibly excluded, destroying the theory’s inherent time-symmetry. We investigate the reason for this, pointing out that it is not necessary and in some cases is counter-productive. We then focus on the direct-action theory, in which the advanced and retarded contributions are present symmetrically, with no opportunity to supplement the particular integral solution of the wave equation with an arbitrary complementary function. One then requires a plausible explanation for the observed broken symmetry, a requirement that, it is commonly understood, cannot be met by the Wheeler-Feynman mechanism because the necessary boundary condition cannot be satisfied in acceptable cosmologies. We take this opportunity to argue that the boundary condition is already met by all expanding cosmologies simply as a result of cosmological red-shift. A consequence is that the cosmological and thermodynamic arrows of time can be equated, the direct action version of EM is preferred, and advanced potentials are ubiquitous."