So what is the actual technical problem here?

"The Question is: What is The Question?" -- John A. Wheeler
By Taylor's theorem of calculus, all gravity fields are approximately uniform in a small region of 4D spacetime.
This is all that is needed in Einstein's 1916 GR.
We never need to invoke a global static uniform gravity field.
Is such a field even possible? -- one might ask.
For static LNIFs, Newton's idea of a global static gravity field is deconstructed as a possible Einstein metric field with an observer-dependent representation for the vacuum outside the source Tuv:
guv(static LNIF) = (1 + VNewton/c^2)(cdt)^2 - (1 + VNewton/c^2)^-1 dz^2 - dx^2 - dy^2
where (note: no factor of 2 in the above model, unlike the central-force 1/r potential)
VNewton = gz
The g-force per unit test mass is then
-g = -dVNewton/dz
directed back toward z = 0.
There is no event horizon in this "dark matter" model and, of course, the g-force is independent of the rest mass of the test particles.
If the g-force is repulsive, pointing away from z = 0, then there is an event horizon for a "dark energy" slab vacuum domain wall (Rindler?).
The source must be something like an infinite uniform-density mass plane at z = 0 in the x-y plane (analogous to the electrical capacitor problem).
The problem is whether the above intuitive guess at a solution is what one gets from Einstein's field equation
Guv + kTuv = 0
where Tuv corresponds to a Dirac delta function δ(z) uniform surface density.
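One can sanity-check the guess directly. The sketch below (my own check, not from the thread, in units c = 1 with A(z) = 1 + gz) feeds the metric above to sympy and verifies that its Einstein tensor vanishes for z > 0, i.e. the guessed metric really is a vacuum solution away from the source plane, with any source confined to z = 0:

```python
import sympy as sp

# Static-LNIF guess: ds^2 = A dt^2 - dz^2/A - dx^2 - dy^2, A = 1 + g z (c = 1).
t, z, x, y = sp.symbols('t z x y')
g = sp.symbols('g', positive=True)
coords = [t, z, x, y]
A = 1 + g * z

metric = sp.diag(A, -1 / A, -1, -1)
inv = metric.inv()
n = 4

# Christoffel symbols Gamma^a_bc = (1/2) g^ad (d_b g_dc + d_c g_db - d_d g_bc).
def christoffel(a, b, c):
    return sp.Rational(1, 2) * sum(
        inv[a, d] * (sp.diff(metric[d, c], coords[b])
                     + sp.diff(metric[d, b], coords[c])
                     - sp.diff(metric[b, c], coords[d]))
        for d in range(n))

Gamma = [[[sp.simplify(christoffel(a, b, c)) for c in range(n)]
          for b in range(n)] for a in range(n)]

# Ricci tensor R_bd = d_a Gamma^a_bd - d_d Gamma^a_ab
#                     + Gamma^a_ae Gamma^e_bd - Gamma^a_de Gamma^e_ab.
def ricci(b, d):
    expr = 0
    for a in range(n):
        expr += sp.diff(Gamma[a][b][d], coords[a]) - sp.diff(Gamma[a][b][a], coords[d])
        for e in range(n):
            expr += Gamma[a][a][e] * Gamma[e][b][d] - Gamma[a][d][e] * Gamma[e][b][a]
    return sp.simplify(expr)

Ric = sp.Matrix(n, n, lambda b, d: ricci(b, d))
R_scalar = sp.simplify(sum(inv[b, d] * Ric[b, d] for b in range(n) for d in range(n)))
Einstein = (Ric - sp.Rational(1, 2) * R_scalar * metric).applyfunc(sp.simplify)

print(Einstein)  # the zero matrix: a vacuum solution away from z = 0
```

In fact, for the strictly linear A(z) the curvature vanishes identically (the metric is Rindler in disguise); a genuine δ(z) source appears only if the potential has a kink, V = g|z|.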
On Mar 31, 2010, at 11:20 PM, Paul Zielinski wrote:

Jack, I believe you've just scored yet another of your world famous "own goals" here.

When he refers to a "homogeneous" gravitational field, Einstein is not talking about a uniform frame acceleration field.
He is talking about an *actual* gravity field of uniform field strength. 

There is no problem with defining such a field operationally, since test object acceleration can always be measured at 
every point at *zero test object velocity*, eliminating any SR-related effects. 

So what Einstein has in mind here when he uses the term "homogeneous to first order" is the non-vanishing curvature 
associated with typical gravity fields produced by matter.

Now it is nevertheless true that a Rindler frame (relativistic accelerating frame of reference) does exhibit such SR-type 
effects -- but this is just another argument against Einstein's proposed principle, since it ensures that the phenomena observed 
in such a frame differ from those observed from a non-accelerating frame even in the presence of a perfectly homogeneous 
gravity field (Einstein's best case).

So yes Einstein was later forced to retreat even from this version of the principle, even given his best case of a perfectly 
homogeneous ("certain kind of") gravity field compared with a uniform acceleration field, eventually restricting the principle to what 
we like to call "local" observations, irrespective of the question of spacetime curvature.

You don't seem to realize that this is an argument *against* Einstein's original concept of equivalence, not for it.

In any case, even if one is restricted to pure local observations, the principle as stated still does not work. Why? Because you 
cannot recover non-tidal gravitational acceleration -- a locally observable phenomenon -- from *any* kind of frame acceleration, 
globally or locally!

You can always bring a test object as close as you like to a source boundary, and locally measure its acceleration with respect
to the source. Such locally observable gravitational acceleration will not be observed in *any* kind of frame acceleration field. Which
means that Einstein's proposed principle as stated is simply false: the laws observed even in a perfectly homogeneous gravity
field are not the same as those observed in a uniform frame acceleration field -- not even approximately.

Vilenkin's vacuum domain wall solutions, in which the vacuum geometry is completely Riemann flat,  show that this kind of situation 
does exist in 1916 GR. A test object released near such a gravitational source will experience locally observable gravitational 
acceleration with respect to the source, which will not be observed in *any* pure frame acceleration field with the gravitational source 
switched off (by which I mean a Rindler frame in a Minkowski spacetime -- a pure frame acceleration field). 

So the only way to get Einstein's principle as stated to work is to ignore the phenomenon of gravitational acceleration. But what kind of a
"theory of gravity" can be based on such a principle?

My answer here is simple: Einstein's version of the equivalence principle is simply not supported by his 1916 theory of gravity. It is
simply a figment of Einstein's fevered imagination.

Which is what I've been saying all along.


On Wed, Mar 31, 2010 at 6:41 PM, JACK SARFATTI <sarfatti@pacbell.net> wrote:
As I have been trying to explain to Zielinski without success, such a global uniform gravity field does not exist because of special-relativistic time dilation and length contraction - I mean it does not exist in the same sense that it would in Newton's gravity theory, which uses only the v/c ---> 0 Galilean group limit of the Lorentz subgroup of the Poincare group. Einstein was perfectly aware of this in the quote Zielinski cites - Zielinski simply does not understand Einstein's text, in my opinion.
On Mar 31, 2010, at 6:23 PM, Paul Murad wrote:

A "paradoxical" property

Note that Rindler observers with smaller constant x coordinate are accelerating harder to keep up! This may seem surprising because in Newtonian physics, observers who maintain constant relative distance must share the same acceleration. But in relativistic physics, we see that the trailing endpoint of a rod which is accelerated by some external force (parallel to its symmetry axis) must accelerate a bit harder than the leading endpoint, or else it must ultimately break. This is a manifestation of Lorentz contraction. As the rod accelerates its velocity increases and its length decreases. Since it is getting shorter, the back end must accelerate harder than the front. This leads to a differential equation showing that, at some distance, the acceleration of the trailing end diverges, resulting in the Rindler horizon.
This phenomenon is the basis of a well known "paradox". However, it is a simple consequence of relativistic kinematics. One way to see this is to observe that the magnitude of the acceleration vector is just the path curvature of the corresponding world line. But the world lines of our Rindler observers are the analogs of a family of concentric circles in the Euclidean plane, so we are simply dealing with the Lorentzian analog of a fact familiar to speed skaters: in a family of concentric circles, inner circles must bend faster (per unit arc length) than the outer ones.
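The "inner circles bend faster" picture can be made quantitative: a Rindler observer at coordinate x maintains constant proper acceleration a = c^2/x, so trailing observers (smaller x) accelerate harder, and a diverges as x -> 0, the Rindler horizon. A small illustrative sketch:

```python
# Proper acceleration of Rindler observers: a(x) = c^2 / x.
# Smaller x (trailing observers) means harder acceleration; a diverges
# as x -> 0, which is the Rindler horizon.
C = 299_792_458.0  # speed of light, m/s

def rindler_acceleration(x_m: float) -> float:
    """Proper acceleration (m/s^2) of the Rindler observer at coordinate x (m)."""
    if x_m <= 0:
        raise ValueError("x = 0 is the Rindler horizon; the acceleration diverges there")
    return C**2 / x_m

# An observer holding a steady 1 g sits about one light-year from her horizon:
g_earth = 9.80665
horizon_distance = C**2 / g_earth   # ~9.2e15 m, roughly 0.97 light-year
print(horizon_distance)

# Trailing observers accelerate harder:
assert rindler_acceleration(1e15) > rindler_acceleration(2e15)
```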

Okay. I just want to make sure we are on the same sheet of music...


hologram dark energy density ~ (c^4/G)(1/asymptotic constant area of future horizon) ~ 10^-29 gm/cc
Our observer-dependent future-horizon hologram, a 2D+1 cosmic computer screen, projects the interior 3D+1 bulk matter fields as a retro-causal image; it is effectively the total Wheeler-Feynman absorber, since all the interior bulk scatterings are merely hologram images of its information processing - IT FROM BIT.
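A back-of-the-envelope check of the quoted order of magnitude (my own arithmetic; the horizon radius of ~1.5e26 m, roughly 16 Gly, is an assumed round number, and all prefactors of order one are dropped):

```python
import math

# Order-of-magnitude check of: dark energy density ~ (c^4/G) / (horizon area),
# with numerical prefactors of order one dropped.
C = 3.0e8          # m/s
G = 6.674e-11      # m^3 kg^-1 s^-2
R = 1.5e26         # m, assumed asymptotic future-horizon radius (~16 Gly)

area = 4 * math.pi * R**2               # horizon area, m^2
energy_density = (C**4 / G) / area      # J/m^3
mass_density = energy_density / C**2    # kg/m^3
rho_g_per_cc = mass_density * 1e-3      # kg/m^3 -> g/cm^3

print(rho_g_per_cc)   # ~5e-30 g/cc, i.e. of order 10^-29 g/cc as quoted
```

This lands within a factor of order one of the observed dark energy density (~6e-30 g/cc), which is the point of the estimate.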

On May 22, 2010, at 1:58 PM, michael ibison wrote:

"Dear Jack

Thank you for your positive (re-?) appraisal"
Jack Sarfatti comments: I did not listen closely to your short talk at the Retrocausal Workshop, AAAS USD, June 2006 was it? Also at that time I did not know about the above picture in Tamara Davis's 2004 PhD thesis and did not connect the inverse area of the future horizon with the dark energy density. Indeed, I was not even aware of the future boundary then, thinking like Susskind et al. only of the past particle horizon. So I could not have made the connection back then - could not have connected the dots. It was only when Creon Levit showed me Tamara's thesis in Dec 2008, in the course of writing up the DICE 2008 paper, that the idea dawned on me. I think Hal Puthoff then reminded me of your work as a consequence? But it still did not sink in until Nick Herbert started objecting to my idea, along with some comments by James Woodward, who likes the idea because it is a way of formulating his Mach's Principle approach - indeed, I think Mach's Principle correctly formulated is a primitive form of the Hologram Conjecture of 't Hooft & Susskind, but they don't seem to see the essential role of Wheeler-Feynman retro-causation ensuring apparent net retarded causation as, e.g., Cramer explains in the transactional approach.
I independently thought of the idea you had already suggested that the classical stretching of the de Broglie waves from expansion of space has the Ehrenfest theorem interpretation as the statistical mean of a sequence of particle inelastic collisions with the geometrodynamic field (not seen in a static field). I picture that in terms of the spin 1 gravity tetrad fields (square roots of the historical Einstein spin 2 metric tensor field).

Continuation of Ibison's remarks:
"of my efforts to investigate the future conformal horizon. It is true that most but not all my effort was to cast the story in traditional RW coordinates, so the conformal horizon took a back seat in the AAAS paper. I was aware however of the importance of the latter, providing as it does, an alternative avenue for understanding the role of the evolution of the scale factor on the EM arrow of time. The conformal view was written up (though in a somewhat disguised way) in an arxiv posting some time ago: .

Even so, I have work to do before I can feel confident that this is all viable. At Vigier VII I will talk about the effect of the conformal boundary in more detail, giving a couple of options that formalize the boundary condition.

I know that you have for a while been talking about a Holographic Principle but had not given it the attention it probably deserved. As a result of my more recent efforts I do now see a role for that way of thinking. So good for you (if this turns out to be correct)."


Michael Ibison


May 22

Area = Entropy Hologram & Entanglement

Posted by: JackSarfatti




Area laws for the entanglement entropy

J. Eisert

Institute of Physics and Astronomy, University of Potsdam, 14469 Potsdam, Germany; Blackett Laboratory, Imperial College London, Prince Consort Road, London SW7 2BW, United Kingdom;and Institute for Mathematical Sciences, Imperial College London, Exhibition Road, London SW7 2PG, United Kingdom

M. Cramer and M. B. Plenio

Blackett Laboratory, Imperial College London, Prince Consort Road, London SW7 2BW, United Kingdom and Institut für Theoretische Physik, Albert-Einstein-Allee 11, Universität Ulm, D-89069 Ulm, Germany. Published 4 February 2010

"Physical interactions in quantum many-body systems are typically local: Individual constituents interact mainly with their few nearest neighbors. This locality of interactions is inherited by a decay of correlation functions, but also reflected by scaling laws of a quite profound quantity: the entanglement entropy of ground states. This entropy of the reduced state of a subregion often merely grows like the boundary area of the subregion, and not like its volume, in sharp contrast with an expected extensive behavior. Such “area laws” for the entanglement entropy and related quantities have received considerable attention in recent years. They emerge in several seemingly unrelated fields, in the context of black hole physics, quantum information science, and quantum many-body physics where they have important implications on the numerical simulation of lattice models. In this Colloquium the current status of area laws in these fields is reviewed. Center stage is taken by rigorous results on lattice models in one and higher spatial dimensions. The differences and similarities between bosonic and fermionic models are stressed, area laws are related to the velocity of information propagation in quantum lattice models, and disordered systems, nonequilibrium situations, and topological entanglement entropies are discussed. These questions are considered in classical and quantum systems, in their ground and thermal states, for a variety of correlation measures. A significant proportion is devoted to the clear and quantitative connection between the entanglement content of states and the possibility of their efficient numerical simulation. Matrix-product states, higher-dimensional analogs, and variational sets from entanglement renormalization are also discussed and the paper is concluded by highlighting the implications of area laws on quantifying the effective degrees of freedom that need to be considered in simulations of quantum states. ...

In classical physics concepts of entropy quantify the extent to which we are uncertain about the exact state of a physical system at hand or, in other words, the amount of information that is lacking to identify the microstate of a system from all possibilities compatible with the macrostate of the system. If we are not quite sure what microstate of a system to expect, notions of entropy will reflect this lack of knowledge. Randomness, after all, is always and necessarily related to ignorance about the state. ... In quantum mechanics positive entropies may arise even without an objective lack of information. ... In contrast to thermal states this entropy does not originate from a lack of knowledge about the microstate of the system. Even at zero temperature we encounter a nonzero entropy. This entropy arises because of a fundamental property of quantum mechanics: entanglement.

This quite intriguing trait of quantum mechanics gives rise to correlations even in situations where the randomness cannot be traced back to a mere lack of knowledge. The mentioned quantity, the entropy of a subregion, is called entanglement entropy or geometric entropy and in quantum information entropy of entanglement, which represents an operationally defined entanglement measure for pure states ... one thinks less of detailed properties, but is rather interested in the scaling of the entanglement entropy when the distinguished region grows in size. In fact, for quantum chains this scaling of entanglement as genuine quantum correlations—a priori very different from the scaling of two-point correlation functions—reflects to a large extent the critical behavior of the quantum many-body system, and shares some relationship to conformal charges.

At first sight one might be tempted to think that the entropy of a distinguished region I will always possess an extensive character. Such a behavior is referred to as a volume scaling and is observed for thermal states. Intriguingly, for typical ground states, however, this is not at all what one encounters: Instead, one typically finds an area law, or an area law with a small often logarithmic correction: This means that if one distinguishes a region, the scaling of the entropy is merely linear in the boundary area of the region. The entanglement entropy is then said to fulfill an area law. It is the purpose of this Colloquium to review studies on area laws and the scaling of the entanglement entropy in a nontechnical manner."
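The quoted scaling is easy to see numerically. The sketch below (my own illustration, not from the Colloquium) takes the ground state of a free-fermion hopping chain at half filling and computes the entanglement entropy of a block from the eigenvalues of the block correlation matrix; the entropy stays of order one while the block volume grows by a factor of eight, the critical "area law with logarithmic correction" rather than volume scaling:

```python
import numpy as np

def block_entropy(n_sites: int, block: int) -> float:
    """Entanglement entropy of the first `block` sites of the ground state
    of a free-fermion hopping chain (open ends, half filling)."""
    # Single-particle hopping Hamiltonian H_ij = -(delta_{i,j+1} + delta_{i+1,j}).
    h = -(np.eye(n_sites, k=1) + np.eye(n_sites, k=-1))
    energies, modes = np.linalg.eigh(h)
    occupied = modes[:, energies < 0]        # fill all negative-energy modes
    corr = occupied @ occupied.T             # correlation matrix <c_i^dag c_j>
    nu = np.linalg.eigvalsh(corr[:block, :block])
    nu = np.clip(nu, 1e-12, 1 - 1e-12)       # guard against log(0)
    return float(-np.sum(nu * np.log(nu) + (1 - nu) * np.log(1 - nu)))

N = 200
for L in (10, 20, 40, 80):
    print(L, round(block_entropy(N, L), 3))
# The entropy grows only logarithmically with L while the volume grows 8x:
# an area law with the logarithmic correction expected for a critical chain.
```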



The problem of why the Pioneer Anomaly's anomalous acceleration is the same order of magnitude as the square root of the dark energy density (in geometric units), though in the wrong direction, has bugged me since 2002. Of course both the dark energy and dark matter densities are of the same order of magnitude, 0.73 vs 0.23 of the critical density for the flat-space universe found in the inflation model. But why large-scale cosmology shows up on the small scale of our solar system is the problem, and why there is a hollow volume from the Sun out to the orbits of the outer planets is another mystery.
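The numerical coincidence behind this is easy to check. Comparing the measured Pioneer acceleration with c*H0 and with c^2 times the square root of the dark energy scale (the H0 and Lambda values below are standard approximate figures), all three agree to within a factor of order one:

```python
# Numerical coincidence behind the Pioneer-anomaly / dark-energy remark.
C = 3.0e8             # m/s
H0 = 2.3e-18          # Hubble rate, 1/s (~71 km/s/Mpc)
LAMBDA = 1.1e-52      # cosmological constant, 1/m^2 (approximate)

a_pioneer = 8.74e-10                     # measured anomalous acceleration, m/s^2
a_hubble = C * H0                        # ~6.9e-10 m/s^2
a_lambda = C**2 * (LAMBDA / 3) ** 0.5    # ~5.4e-10 m/s^2

for name, a in [("Pioneer", a_pioneer),
                ("c*H0", a_hubble),
                ("c^2*sqrt(Lambda/3)", a_lambda)]:
    print(name, a)
# All three are ~10^-9 m/s^2: the same order of magnitude, as noted above.
```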





Influence of global cosmological expansion on local dynamics and kinematics

Matteo Carrera*

Institute of Physics, University of Freiburg, Hermann-Herder-Straße 3, D-79104 Freiburg, Germany


Domenico Giulini

Institute for Theoretical Physics, University of Hanover, Appelstraße 2, D-30167 Hannover, Germany


Published 28 January 2010

"Attempts to estimate the influence of global cosmological expansion on local systems are reviewed. Here “local” is taken to mean that the sizes of the considered systems are much smaller than cosmologically relevant scales. For example, such influences can affect orbital motions as well as configurations of compact objects, like black holes. Also discussed are how measurements based on the exchange of electromagnetic signals of distances, velocities, etc. of moving objects are influenced. As an application, orders of magnitude of such effects are compared with the scale set by the apparently anomalous acceleration of the Pioneer 10 and 11 spacecrafts, which is 10^−9 m/s^2. There is no reason to believe that the latter is of cosmological origin. However, the general problem of gaining a qualitative and quantitative understanding of how the cosmological dynamics influences local systems remains challenging, with only partial clues being so far provided by exact solutions to the field equations of general relativity."


Our dark energy future event horizon, accelerating the expansion of the universe, is the Wheeler-Feynman total absorber, giving us Wheeler's IT FROM BIT with "retrocausality without retrocausality."

"In the direct action theory the EM fields of a single source are not exclusively retarded but are time-symmetric. The appearance of pure retardation is now explained as the result of interference by time-symmetric exchanges with the cosmological gravitational field. Just as the effect of a dielectric continuum can be regarded as the final result of a series of absorptions and re-emissions on the microscopic level [23], the macroscopic exchanges with the gravitational field implied by (34) can be interpreted likewise. If each exchange is subject to the constraint that it be time- symmetric, then the gravitational damping plays the same role as do the future absorbers in the Wheeler-Feynman theory. Anti-phase advanced waves from these exchanges arrive back at the current source to re-enforce the retarded component and cancel the advanced component. Consequently these proposed cosmological boundary conditions guarantee that every ‘photon’ (of which, strictly, there are now none) will be absorbed. The absorption is not by matter, but the whole system - which includes a term for the energy in the cosmological gravitational field. A ‘prediction’ of this implementation of the cosmological boundary condition is that, if the universe were not expanding, then there would be no apparent predominance of retarded radiation. Consequently the future state of the universe is felt in the present. If these arguments stand then the direct action theory is validated (and therefore preferred), and advanced potentials in the sense of (15) and (16) are ubiquitous."

The detection of dark energy shows that retarded light emitted toward the future is infinitely redshifted after traveling only a finite distance in our accelerating expanding space. Our future space time geometry is qualitatively different from our past inflation to hot Big Bang geometry, hence the asymmetric Arrow of Time explaining why we age as the universe gets bigger.

Ibison's last section below is now obsolete because, in 2006, he was not aware of Tamara Davis's 2004 PhD thesis, which solves his problem here:

Reverse Causation
"A few words on the relevance of advanced potentials to the theme of this conference: The use of the phrase ‘reverse causation’ implies that one can meaningfully (i.e. semantically, if not in practice physically) separate the notion of logical casualty from temporal ordering. In order to do that, one must be able to identify a (more or less universal) property that distinguishes between a cause and an effect that is not the temporal order. Some arguments have been given here in support of the rehabilitation of the advanced potential. If one wished to identify all currents as causes and all potentials as effects, then absorption of radiation is an example of reverse causation. Since the most mathematically efficient description of absorption is through (exclusively) advanced potentials (Eq. (8) with Aout ), one may choose to associate reverse causation with the predominance of advanced potentials in an appropriately defined maximally efficient description. But no connection with the flow of entropy has been established in this document. As a result of considerations in the section ‘The Cosmological Boundary Condition’, it is not clear that entropy necessarily increases in Cosmological time, even in the event that retarded potentials turn out to be predominant in the ‘most efficient’ description of EM processes."


Michael Ibison
Institute for Advanced Studies at Austin 4030 West Braker Lane, suite 300, Austin, Texas 78759, USA.
Submitted for publication in Proceedings of AAAS Conference on Reverse Causation, 2006.
Abstract. "Advanced electromagnetic potentials are indigenous to the classical Maxwell theory. Generally however they are deemed undesirable and are forcibly excluded, destroying the theory’s inherent time-symmetry. We investigate the reason for this, pointing out that it is not necessary and in some cases is counter-productive. We then focus on the direct-action theory in which the advanced and retarded contributions are present symmetrically, with no opportunity to supplement the particular integral solution of the wave equation with an arbitrary complementary function. One then requires a plausible explanation for the observed broken symmetry that, it is commonly understood, cannot be met by the Wheeler-Feynman mechanism because the necessary boundary condition cannot be satisfied in acceptable cosmologies. We take this opportunity to argue that the boundary condition is already met by all expanding cosmologies simply as a result of cosmological red-shift. A consequence is that the cosmological and thermodynamic arrows of time can be equated, the direct action version of EM is preferred, and that advanced potentials are ubiquitous."


Famous theorists are plagued by amateurs claiming to have found a 'theory of everything'. Gerardus 't Hooft strikes back with an online physics class.


Gerard 't Hooft's biography puts him in strong contention for the title of the father of modern theoretical physics. He won the Nobel Prize for Physics in 1999 for his elucidation of the mathematics that now underlies the Standard Model of particle physics.



't Hooft also invented the world hologram idea (along with Susskind), but neither of them seems to know that the source of our observer-dependent world hologram's influence on our present must be in our future, not in our past. The Wheeler-Feynman theory comes to the rescue because the hologram is also the perfect future absorber, enforcing net retarded causality through subterranean retrocausality. In Wheeler's terms:

Retrocausality without retrocausality.

Nonlocality without nonlocality.

On May 12, 2010, at 2:26 PM, Mc wrote:

"The exponential metric is not Puthoff's idea. It was developed then truncated to fit Einstein's field equations. Even Einstein is quoted saying the exponential metric should have been used. The truncated metric gives an unphysical singularity." Rmc
Yes, of course, Dicke first wrote it in 1961, but Hal Puthoff used it extensively in the late 90's in his index-of-refraction approach - which, BTW, I am now using to some extent in my idea to increase the effective gravity field inside a superconducting negative-index-of-refraction meta-material fuselage to make the anti-gravity field for zero g-force warp drive propulsion in vacuum.
The basic coupling should be (index of refraction)^4 GNewton/c(vacuum)^4.
The superconductor slows light to a stop, i.e. index >> 1, and the negative index means that static non-radiative EM fields, e.g. w = +1/3 in Casimir quantum wells, will antigravitate! That is a big bang for a small buck - I mean good efficiency in bending spacetime for small amounts of applied EM energy density.
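For scale: Einstein's coupling k = 8πG/c^4 is minuscule, which is why the conjectured n^4 enhancement is the whole game. A hypothetical illustration (the index value n is made up purely for the example):

```python
import math

G = 6.674e-11   # m^3 kg^-1 s^-2
C = 2.998e8     # m/s

# Einstein's gravitational coupling k = 8*pi*G/c^4.
k = 8 * math.pi * G / C**4
print(k)            # ~2.1e-43: why ordinary EM energy barely bends spacetime

# Conjectured enhancement: effective coupling scales as n^4 for index n.
# The value of n below is purely hypothetical, chosen for illustration.
n = 1.0e6
print(n**4 * k)     # even a 10^24-fold boost leaves a coupling of only ~2e-19
```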
The singularity in the mainstream vacuum Kerr solution is not a problem. (So far there are no astrophysical charged black holes, so we don't need the Kerr-Newman solution there; we do need it, of course, as Bohm hidden variables (dual to strings) on the hadronic scale and perhaps as models for quarks.)


On May 12, 2010, at 7:42 AM, Mc wrote:

If we use the exponential metric there are no Black Holes, but the problem with an exponential metric is it doesn't solve Einstein's field equations so the metric gets truncated. 

The problem with that is we wind up with singularities, a physical impossibility from the standpoint of good Physics. 

So, what's wrong with Einstein's field equations that they can't accept an exponential metric which Einstein thought was correct?" Rmc




Black holes are an observational fact.

The metric representations are relative to a timelike congruence of ideal observers, formally defined by tetrad gravity fields.
You can write Puthoff's SSS exponential metric in the static LNIF set of observers and then ask what kind of matter stress-energy tensor Tuv would give that solution according to Einstein's field equation
Guv + (8piG/c^4)Tuv = 0
So far there is no observational evidence that it's useful.
Detailed studies of radiation from accretion disks etc. by Martin Rees (Astronomer Royal, Master of Trinity College Cambridge, President of the Royal Society) and his Institute for Theoretical Astronomy have proved conclusively the factual status of black holes, essentially described by the Kerr solution to a good approximation.

I think the facts make it very clear that there is no acceptable alternative to Einstein's GR that fits the astrophysical observations. That's my opinion for the record. All the evidence you need you can find here

& here
This is not to exclude torsion field effects in devices using superconductors along the lines of Ray Chiao. 
Torsion is a natural extension of Einstein's basic theory, viewed as a local gauge theory of the global symmetries of his special relativity.
On May 12, 2010, at 8:57 PM, Paul Zielinski wrote:

On Wed, May 12, 2010 at 7:19 PM, JACK SARFATTI <sarfatti@pacbell.net> wrote:
You can write any damn metric you like and then compute the energy tensor needed for that metric.

"I think this is about an alternative set of field equations that is consistent with known observational data. It's not clear that Einstein's equations are the only ones that are mathematically admissible."
Who cares? All sorts of models are mathematically admissible, but are not useful for explaining and predicting real observations.

"It's not clear even in the context of Einstein's Riemannian model for gravity that Einstein's field equations  are the only ones that are consistent with observations."
The urls above clearly demonstrate that you are mistaken.
It's completely ignorant of real physics to say "Einstein was wrong" - that's really crackpot.

"Einstein himself said that Einstein was wrong on fundamental issues Jack. For example, Einstein retreated from his early position that Poincare's ether was not a scientifically meaningful concept in physics. So I think  you are overplaying your hand here."
We have discussed this ad nauseam; you are, in my opinion, pulling Einstein's remarks out of context. Basically it's irrelevant what transient opinions the creators of a theory may have had about their own theories in the absence of relevant measurements and observations. Einstein died before the relevant technology had been developed to test his theory. The "ether" is a red herring; it depends what one means by "ether". Of course in quantum theory we have virtual particles inside the vacuum, an "ether" that respects the local symmetries of GR in the appropriate limit. Sure, you can think of the four spin 1 gravity tetrad Lorentz group vector fields e^I as fields on a locally flat Minkowski space where
ds^2 = (Minkowski)IJ e^I e^J = guv(accelerating local frame) e^u e^v
e^I(unaccelerated local frame) = (tetrad)^Iu e^u(accelerating locally coincident frame)
In addition, the IT tetrads e^I are Bell-pair spinor qubit entangled states in the quantum informational pre-geometry:
e^I = (Newman-Penrose)^Iii' (Qubit)^i (Qubit')^i'

"How about black holes? Einstein said he didn't believe in them. You say that black holes are an observational  fact. Doesn't that make Einstein wrong, according to your own arguments?"
All irrelevant. If Einstein were alive today he would not dispute the reality of black holes.


"The point here is if you are right about black holes, then you are basically saying that Einstein was wrong on a major  issue in gravitational theory, since as a matter of historical fact he rejected them as physical solutions to the 1916
field equations.

Does that make you a 'crackpot'? "

Einstein did not understand the full implications of the solutions of his field equations;
no one did until Penrose & Hawking's global light-cone techniques more than 5 years after Einstein died - assuming classical positive energy conditions, violated in quantum theory of course. Paul, your way of looking at theory is very bizarre in my opinion. You draw silly conclusions from innocuous historical accidents.
What matters is
Guv + kTuv = 0
one can make singularity-free dark star solutions of them by invoking dark energy (cosmological constant Λ > 0) in the interior - so what?
From the outside it looks just like a black hole.
Your point is a quibble in my opinion.
On May 13, 2010, at 7:13 AM, Paul Murad wrote:

"Sorry but I have to agree with Paul.  Your first comment was in response to an individual that indicated we need to keep an open mind and perhaps review what we know about the situation.
It was uncalled for that you would characterize him as a crackpot.  There is nothing wrong with saying we should go back and question the conventional wisdom because we may gain some new or unknown insights."
Paul M
Anyone who does not accept Einstein's theory of gravity as the only viable theory that fits observation in its proper domain of validity is ipso facto a physics-illiterate crank in my book. That's my opinion. I gave the reference from Cliff Will that supports my case. Alternative theories of gravity have the same status as the Nazi hollow Earth theory in my opinion.
Now this does not exclude the natural extensions of Einstein's 1916 GR, e.g. adding "square root" tetrads, torsion, spinors etc. - an experimental issue that does not contradict the basic principles of Einstein's theory, especially when viewed in terms of the deep principle of local gauge invariance.


May 11

Cloning Pointer Quantum States?

Posted by: JackSarfatti


Decoherence, einselection, and the quantum origins of the classical

Wojciech Hubert Zurek Theory Division, LANL, Mail Stop B210, Los Alamos, New Mexico 87545

(Published 22 May 2003 Reviews of Modern Physics)
"The manner in which states of some quantum systems become effectively classical is of great significance for the foundations of quantum physics, as well as for problems of practical interest such as quantum engineering. In the past two decades it has become increasingly clear that many (perhaps all) of the symptoms of classicality can be induced in quantum systems by their environments. Thus decoherence is caused by the interaction in which the environment in effect monitors certain observables of the system, destroying coherence between the pointer states corresponding to their eigenvalues. This leads to environment-induced superselection or einselection, a quantum process associated with selective loss of information. Einselected pointer states are stable. They can retain correlations with the rest of the universe in spite of the environment. Einselection enforces classicality by imposing an effective ban on the vast majority of the Hilbert space, eliminating especially the flagrantly nonlocal ‘‘Schrodinger-cat states.’’ The classical structure of phase space emerges from the quantum Hilbert space in the appropriate macroscopic limit. Combination of einselection with dynamics leads to the idealizations of a point and of a classical trajectory. In measurements, einselection replaces quantum entanglement between the apparatus and the measured system with the classical correlation. Only the preferred pointer observable of the apparatus can store information that has predictive power. When the measured quantum system is microscopic and isolated, this restriction on the predictive utility of its correlations with the macroscopic apparatus results in the effective ‘‘collapse of the wave packet.’’ The existential interpretation implied by einselection regards observers as open quantum systems, distinguished only by their ability to acquire, store, and process information. 
Spreading of the correlations with the effectively classical pointer states throughout the environment allows one to understand “classical reality” as a property based on the relatively objective existence of the einselected states. Effectively classical pointer states can be “found out” without being re-prepared, e.g., by intercepting the information already present in the environment. The redundancy of the records of pointer states in the environment (which can be thought of as their “fitness” in the Darwinian sense) is a measure of their classicality. A new symmetry appears in this setting. Environment-assisted invariance or envariance sheds new light on the nature of ignorance of the state of the system due to quantum correlations with the environment and leads to Born’s rules and to reduced density matrices, ultimately justifying basic principles of the program of decoherence and einselection."
Simple eh? ;-)
"How the classical world arises from an ultimately quantum substrate has been a question since the advent of quantum mechanics [1–7]. Decoherence is now commonly used to study this quantum-classical transition [8–10]. Its theory, however, treats the environment as a sink where information about the system gets lost forever. Yet the information deposited in the environment can be intercepted, and it is our primary source of information about the Universe. Indeed, decohering interactions with the environment can amplify and store an impression of the system. Amplification was invoked already by Bohr [11] in the context of measurements. Early [12], as well as more recent [9,13,14], discussions of decoherence note the importance of redundancy, and provide an information-theoretic framework for how the environment acts as an amplifier and as a source of information about the “system of interest” [15–19].

Quantum Darwinism reflects this new focus on the environment as a communication channel [15–17]. When one receives a fragment of the environment by, for instance, intercepting with one’s eyes a portion of photons that are scattered off a system of interest (e.g., the text of this Letter), one acquires information about it. Previous studies found that, with an initially pure environment, one can acquire information about the preferred observables of the system even from small environment fragments [17]. This explains the emergence of objectivity, as it allows many initially ignorant observers to independently obtain nearly complete information and reach consensus about the state of the system by intercepting different fragments of the environment. Classicality of states can now be quantified in terms of the redundancy of information transferred to and recorded by the environment. However, it is unclear how well one can accumulate information starting with a mixed, or hazy, environment, such as one at finite temperature. Yet the photon environment that is responsible for the vast majority of the information we gain has precisely such a hazy character. This Letter shows that even hazy environments will, in the end, communicate a very clear image."

PRL 103, 110402 (2009)
Quantum Darwinism in a Mixed Environment
Michael Zwolak, H. T. Quan, and Wojciech H. Zurek
Theoretical Division, MS-B213, Los Alamos National Laboratory, Los Alamos, New Mexico 87545, USA
(Received 29 April 2009; revised manuscript received 8 August 2009; published 8 September 2009)


May 11

Question for Nick Herbert

Posted by: JackSarfatti

Why doesn't W. Zurek's quantum Darwinism violate the no-cloning theorem of quantum mechanics?

"The basis of almost any theoretical quantum-to-classical transition lies in the concept of decoherence. In the quantum world, many possible quantum states “collapse” into a single state due to interactions with the environment. To quantum Darwinists, decoherence is a selection process, and the final, stable state is called a “pointer state.” Although pointer states are quantum states, they are “fit enough” to be transmitted through the environment without collapsing and can then make copies of themselves that can be observed on the macroscopic scale. Although everything in our world is quantum at its core, our classical view of the universe is ultimately determined by these pointer states." Physics Org
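To make the decoherence picture above concrete, here is a minimal toy sketch of my own (not from Zurek's papers): a qubit prepared in a superposition has its density matrix's off-diagonal coherences damped by repeated environmental "monitoring" of the pointer basis, while the diagonal populations, the would-be pointer-state weights, survive untouched. The damping factor gamma is an assumed illustrative parameter, not anything derived from the quoted work.

```python
import numpy as np

# Qubit starts in |+><+|: equal populations, full coherence.
rho = 0.5 * np.array([[1, 1],
                      [1, 1]], dtype=complex)

def monitor_step(rho, gamma=0.9):
    """One environment interaction that 'monitors' the z (pointer) basis:
    off-diagonal coherences shrink by gamma; populations are untouched."""
    out = rho.copy()
    out[0, 1] *= gamma
    out[1, 0] *= gamma
    return out

# Many weak interactions with environment subsystems.
for _ in range(50):
    rho = monitor_step(rho)

# Diagonal (pointer-state probabilities) is still 0.5 / 0.5;
# coherence has decayed to ~0.5 * 0.9**50, i.e. effectively classical.
print(np.round(rho.real, 6))
```

The surviving diagonal states are the "pointer states" of the quoted passage: they are the ones stable under the monitoring interaction, and hence the ones whose records can proliferate in the environment.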

New evidence for quantum Darwinism found in quantum dots

Is this the answer? That is, the "pointer states" must be pairwise orthogonal for a "good measurement".

Non-clonability can be seen as a property of arbitrary sets of quantum states. If we know that a system's state is one of the states in some set S, but we do not know which one, can we prepare another system in the same state? If the elements of S are pairwise orthogonal, the answer is always yes: for any such set there exists a measurement which will ascertain the exact state of the system without disturbing it, and once we know the state we can prepare another system in the same state. If S contains two elements that are not orthogonal to each other (in particular, the set of all quantum states includes such pairs), then an argument like that given above shows that the answer is no.
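The orthogonality condition above can be checked numerically. A minimal sketch (my own illustration, assuming the standard inner-product argument): a unitary cloner U with U|s>|0> = |s>|s> would preserve inner products, forcing <a|b> = <a|b>^2 for every pair in S, which holds only when <a|b> is 0 or 1. So only pairwise orthogonal (or identical) states are clonable, which is exactly why orthogonal pointer states evade the no-cloning obstruction.

```python
import numpy as np

# Non-orthogonal pair: |0> and |+>. A unitary cloner would need
# <a|b> == <a|b>**2, but here |<a|b>| = 0.7071 while |<a|b>|^2 = 0.5,
# so no such cloner exists for this pair.
a = np.array([1, 0], dtype=complex)                # |0>
b = np.array([1, 1], dtype=complex) / np.sqrt(2)   # |+>

overlap = abs(np.vdot(a, b))
print(overlap, overlap**2)        # unequal -> cloning forbidden

# Orthogonal pair: |0> and |1>. The constraint is satisfied (0 == 0**2),
# and indeed a projective measurement in the {|0>, |1>} basis identifies
# the state without disturbing it, after which a copy can be prepared.
c = np.array([0, 1], dtype=complex)                # |1>
print(abs(np.vdot(a, c)))         # 0 -> measure-and-prepare "cloning" works
```

This is the sense in which quantum Darwinism copies only pointer-state information into the environment: redundant records of an orthogonal set are permitted, while cloning of arbitrary superpositions is not.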