Oct 04

Subject: Re: Comments on Woodward's Star Gate Review Article - space-time foam
"The customary way of making things larger is to blow them up.  This is usually done by adding energy, sometimes explosively, to the object to be enlarged.  It is worth noting that this process works for blowing things up in spacetime.  Whether it will work for blowing up spacetime itself is another matter.  The size of the wormholes of the putative quantum spacetime foam presumably is about the Planck length – that is, about 10^-33 cm across.  This is about 20 orders of magnitude smaller than the classical electron radius and 18 orders of magnitude smaller than the diameter of nuclei.  How do you blow up something so fantastically small?  By smoking everything in its environs along with it.  The two tools at our disposal to attempt this are the Relativistic Heavy Ion Collider (RHIC) and the Large Hadron Collider (LHC).  Taking these devices (optimistically) to put about 10 TeV per interaction event at our disposal, we find in equivalent mass that gives us about 10^-20 gm.  This is roughly 15 orders of magnitude less than the mass of our transient wormholes to be ablated.  Given present and foreseeable technology, this scheme is impossible.  Moreover, the transient wormholes only exist for 10^-43 seconds, making the temporal interaction cross-section hopelessly small." - JW
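JW's arithmetic can be checked in a few lines. A minimal sketch (the 10 TeV figure is his optimistic estimate; the Planck mass value is the standard one), not part of the original exchange:

```python
EV_TO_J = 1.602e-19          # joules per electron-volt
C = 2.998e8                  # speed of light, m/s
PLANCK_MASS_KG = 2.18e-8     # Planck mass, ~2.2e-5 g

e_event = 10e12 * EV_TO_J          # ~10 TeV per interaction event, in joules
m_equiv_g = e_event / C**2 * 1e3   # mass equivalent in grams

print(m_equiv_g)                         # ~1.8e-20 g, JW's "about 10^-20 gm"
print(PLANCK_MASS_KG * 1e3 / m_equiv_g)  # ~1e15: 15 orders of magnitude short
```

The ratio confirms the quoted shortfall: the mass-energy available per collision sits roughly fifteen orders of magnitude below the Planck mass of a transient wormhole.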
James's earlier remark that there may not be a spacetime foam is well taken. If gravity is an emergent low-energy effective c-number field theory from the QCD/SU(3) - weak SU(2) vacuum superconductor, then probably there isn't. However, in string-brane theory, gravity as a microscopic field has a Yukawa small-scale metric, as in Salam's 1973 f-gravity, e.g.

G* = G_Newton (1 + a e^(-r/b))

a ~ 10^40
b ~ 10^-14 cm? maybe 10^-16 cm

Therefore the effective Planck length at small scales would be L_P* >> L_P, maybe as large as 10^-11 cm or even larger?


This dramatically lowers the energies needed. The LHC data should give us information on this.
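The Yukawa-modified coupling above is easy to evaluate numerically. A minimal sketch (the values of a and b are the speculative ones quoted above, not measured quantities):

```python
import math

G_NEWTON = 6.674e-11   # m^3 kg^-1 s^-2

def g_effective(r_cm, a=1e40, b_cm=1e-14):
    """Salam-style f-gravity: G* = G_Newton * (1 + a*exp(-r/b))."""
    return G_NEWTON * (1.0 + a * math.exp(-r_cm / b_cm))

# Deep inside the Yukawa range (r << b), the enhancement saturates near a:
print(g_effective(1e-16) / G_NEWTON)   # ~1e40

# Well outside it (r >> b), the correction is exponentially negligible:
print(g_effective(1e-8) / G_NEWTON)    # 1.0: ordinary Newtonian gravity
```

The point of the sketch is the crossover: an enormous short-range enhancement of G that is invisible at laboratory and astronomical distances.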
Gravity in large extra dimensions
In 1998, Nima Arkani-Hamed found himself pondering one of the conundrums of modern physics: why is gravity so much weaker than the other fundamental forces?
www.eurekalert.org/features/doe/2001.../dbnl-gil053102.php

The search for extra dimensions - physicsworld.com
The only force we can use to probe gravity-only extra dimensions is, of course, gravity itself. Remarkably we have almost no knowledge of gravity at ...
physicsworld.com/cws/article/print/403

symmetry - December 2005/January 2006 - the search for extra ...
It is this very idea that has led theorists to predict the existence of extra dimensions. Gravity, they postulate, exists in more dimensions than we do, ...
www.symmetrymagazine.org/cms/?pid=1000237

Search for Extra Dimensions
Well, in fact we could “feel” these extra dimensions through their effect on gravity. While the forces that hold our world together (electromagnetic, weak, ...
d0server1.fnal.gov/users/gll/public/edpublic.htm

Gravity in Extra-dimensions, Manyfold and Pre-Big Bang Universe
In fact the LHC now becomes a quantum-gravity machine, which can look into these extra dimensions of space through apparent violations of energy ...
universe-review.ca/R15-16-manyfoldu.htm

Large extra dimension - Wikipedia, the free encyclopedia
In particle physics, the ADD model, also known as the model with large extra dimensions, is an alternative scenario to explain the weakness of gravity ...
en.wikipedia.org/wiki/Large_extra_dimension

[1005.3220] The effect of extra dimensions on gravity wave bursts ...
by E O'Callaghan - 2010
May 18, 2010 ... Abstract: We explore the kinematical effect of having extra dimensions on the gravity wave emission from cosmic strings. ...
arxiv.org/abs/1005.3220

String theory motivated a breakthrough in gravity and extra dimensions (1990s). Gravity (closed strings) propagates in the extra dimensions ...
www.aps.org/meetings/multimedia/april2007/upload/Y4_2.ppt

[PDF] Extra Dimensions, Dark Energy, and the Gravitational Inverse ...
by LJ Furniss - 2008
Keywords: Gravity, Extra Dimensions, Inverse-square Law, Dark Energy. 1. Introduction and Motivation. Theoretical speculations that seek to solve the ...

Extra Dimensions in Newtonian Gravity - WeirdSciences
Oct 29, 2009 ... Extra noncompact dimensions would change the force law of gravity away from being the inverse square law that has been and still is measured ...

On Oct 4, 2010, at 9:48 AM, JACK SARFATTI wrote:

"That means, in turn, for example, that the mass of silica (glass) should be different from the masses of its silicon and oxygen constituents measured separately.  This is not a small effect.  Need I say, there is no evidence whatsoever that the masses of compounds depend on their indexes of refraction?" - JW

Jim's remark does not hold up if you put in the numbers, even accepting his assumptions. Indices of refraction n of common materials are close to 1; even if n were as high as 10, the effect would be very tiny unless an extraordinary attempt were made to detect it. Of course, no one has made such an attempt! It might be a good idea to look!

Take any material in the lab, compute its total energy E, then multiply it by n^4 G/c^4 to get an induced curvature - it will be tiny compared to the Earth's curvature.
Indeed, the induced curvature at Schwarzschild coordinate r outside of E is

1/(radius of induced curvature)^2 ~ (n^4 G E/c^4) r^-3

Furthermore, the dispersion in n needs to be included in any actual attempt to measure anomalous gravity. The basic point is that the effect is very tiny under normal conditions.
Lab objects are still test particles in the Earth's gravity field - their back-reaction is still relatively tiny.
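Putting numbers to the estimate above makes the point concrete. A minimal sketch (the 1 kg glass sample, n ~ 1.5, and the 1 m probe distance are illustrative assumptions, not from the original discussion):

```python
G = 6.674e-11   # m^3 kg^-1 s^-2
C = 2.998e8     # m/s

def induced_curvature(mass_kg, n, r_m):
    """1/(radius of curvature)^2 ~ (n^4 G E / c^4) r^-3, with E = m c^2."""
    E = mass_kg * C**2
    return (n**4 * G * E / C**4) / r_m**3

# 1 kg of glass (n ~ 1.5) probed 1 m away:
k_lab = induced_curvature(1.0, 1.5, 1.0)

# Earth's own curvature scale at its surface, ~ G M / (c^2 R^3):
M_E, R_E = 5.972e24, 6.371e6
k_earth = G * M_E / (C**2 * R_E**3)

print(k_lab, k_earth)   # the lab effect is orders of magnitude smaller
```

With these assumed values the lab curvature comes out around 10^-27 m^-2, several orders of magnitude below the Earth's ~10^-23 m^-2, consistent with the claim that lab objects remain test particles.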

It may be possible to detect this predicted anomalous gravity in short-lived atomic gas Bose-Einstein condensates and even in laser beams. Obviously no one has tried, because no one, until now, has even conceived of this possibility; it needs to be checked out.


Superfluid helium "appears" to defy gravity - hmmm... "appears"?

Let us also not forget Ray Chiao's theory of electromagnetic-gravity wave transduction in a superconductor as well as Giovanni Modanese's model all suggesting anomalous gravity in Bose-Einstein condensates.
Refractive Index of Fe2O3, Iron Oxide
The refractive index (index of refraction) and extinction coefficient at 632.8 nm are 2.918 and 0.029. Below are files of complete refractive index and extinction coefficients.
Material                      Index
Air at STP                    1.00029
Water at 20 C                 1.33
Ethyl alcohol                 1.36
Sugar solution (30%)          1.38
Fused quartz                  1.46
Sugar solution (80%)          1.49
Typical crown glass           1.52
Crown glasses                 1.52-1.62
Spectacle crown, C-1          1.523
Sodium chloride               1.54
Carbon disulfide              1.63
Flint glasses                 1.57-1.75
Heavy flint glass             1.65
Extra dense flint, EDF-3      1.7200
Methylene iodide              1.74
Rare earth flint              1.7-1.84
Lanthanum flint               1.82-1.98
Arsenic trisulfide glass      2.04
On Oct 3, 2010, at 10:39 PM, JACK SARFATTI wrote:

PPS string theory has been attacked as not being background independent yet others say it's the only good theory for quantum gravity.

My point is that none of these key issues are well understood even by the experts, and, to hoist Jim by his own petard ;-), I quote his own very good points, with which I agree:

"And you need a Jupiter mass – 2 x 10^27 kg – concentrated in a region of small dimensions. A simple calculation assuming a throat diameter of, say, 10 meters and a wall thickness of a meter or so, leads to an exotic density of ~10^22 gm/cm^3, that is, on the order of seven orders of magnitude greater than nuclear density.  Similar densities, as a matter of idle interest, are required to make warp drives.

Faced with this fact, several responses are possible.  Thorne’s recently expressed view is that the engineering of  time machines (enabled by stargates) will require a profound understanding of the elusive, yet to be created theory of quantum gravity – if it can be done at all.  And that may require evolution on our part comparable to the evolutionary distance between amoeba and us.  That is, asking us to invent time machines is like asking amoeba to invent jet aircraft.  This would mean that the clever aliens who have purportedly already done this are hundreds of millions of years more evolved than are we.  A simpler, equivalent stance is to assert that anyone trying to make a stargate is an idiot.  To propose that clever aliens have succeeded in this seemingly impossible task is just plain stupid.  Working on the problem is a waste of time.  Those of this view may be right.  But if we adopt this view and abandon all work on the problem, we surely have no chance, however small, of succeeding.

  Another approach to the problem of making stargates is to turn Thorne’s heuristic question around.  Instead of asking what constraints the laws of physics place on clever aliens, we propose that clever aliens have made stargates and ask how, with plausible physics, might they have done so?  Of course, allowing physics other than that endorsed by the mainstream opens the door to all sorts of excesses and stupidities.  But if only mainstream physics were ever allowed, no progress on much of anything really interesting would ever be made.  Arguably, a heuristic exercise of this sort has the same sort of value as that of Thorne and his graduate students years ago in that it illuminates what must be done to achieve the goal of stargates, thus shedding light on the requisite physics and thus on whether they will ever be made." -- JW
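JW's density estimate in the passage quoted above is easy to reproduce. A minimal sketch, modeling the throat wall as a spherical shell (the shell geometry is my assumption; JW does not specify the shape):

```python
import math

M_JUPITER_KG = 2e27      # JW's Jupiter-mass figure
r_in = 10.0 / 2          # throat radius: 10 m diameter
r_out = r_in + 1.0       # wall thickness ~1 m

shell_volume = 4/3 * math.pi * (r_out**3 - r_in**3)   # m^3
density_gcc = M_JUPITER_KG / shell_volume * 1e-3      # kg/m^3 -> g/cm^3

print(density_gcc)            # ~5e21, i.e. on the order of 10^22 g/cm^3
print(density_gcc / 2.3e14)   # vs nuclear density: ~7 orders of magnitude
```

The shell volume is only a few hundred cubic meters, so a Jupiter mass packed into it does indeed land about seven orders of magnitude above nuclear density, as JW states.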

On Oct 3, 2010, at 10:29 PM, JACK SARFATTI wrote:
Salam's f-gravity differs from Einstein's gravity, which comes from a massless spin-2 graviton: in the former, gravity is due to a massive spin-2 meson.
On Oct 3, 2010, at 10:14 PM, JACK SARFATTI wrote:
"This means, if Equation (8) is correct, that the masses of things should depend on their indexes of refraction.  That means, in turn, for example, that the mass of silica (glass) should be different from the masses of its silicon and oxygen constituents measured separately.  This is not a small effect.  Need I say, there is no evidence whatsoever that the masses of compounds depend on their indexes of refraction?" -- JW

The above argument is also questionable because the rest masses of electrons, nucleons etc. are determined from a completely different scale by the Higgs field Yukawa couplings for the leptons and quarks and by the zero point kinetic energy of the confined quarks for the nucleons as given in QCD. The index of refraction is a low energy emergent collective property of many electrons and atoms and indeed gravity itself is probably emergent. For warp drive and star gates we only need gravity in the low energy regime not at the small scale of the Compton wavelength of the electron for example.

On Oct 3, 2010, at 9:52 PM, JACK SARFATTI wrote:
"Working back through the argument, the first thing to note is that the substitution that leads to Equation (8) is not valid.  Neither is it supported by the facts of observation.  It’s not valid because the polarizable vacuum model is not background independent, and any plausible theory that is physically equivalent to general relativity must be background independent." -- JW

Jim's basic error is to make a priori arguments on an empirical issue.

There are no relevant facts of observation here apart from flying saucers, which, if they are real, refute Jim's argument by his own admission - see above.

I mean no one has yet made an ultra high temperature superconducting meta-material in which n ~ 10^10 with both negative permittivity and permeability in some range of frequencies and wave vectors for non-radiative near EM fields.

Yet, there are tantalizing controversial claims of anomalous gravity from superconductors.

Second, the appeal to background independence is spurious as is well known from the fact, as an example, that crystals spontaneously break translational symmetry. In the same way, we can have a spontaneous breakdown of background independence in the superconductor ground state without violating the background independence of the Einstein field equations. James has confused the symmetries of the field equations with the symmetries of their solutions!

One can see the error in James's argument by looking at the standard cosmological solution in which there is a preferred foliation in which the Cosmic Microwave Background (CMB) is maximally isotropic - even though the field equations have no preferred global frame. The temperature of the CMB is an objective measure of global cosmic time.

That said, Jim's conclusion in the end may be correct as I have always said, but that is an empirical issue.

More comments later as I am watching "Rubicon" on TV. ;-)


A brief history of my contributions to acausality in quantum physics


Fred Alan Wolf

Have Brains / Will Travel:
a Global Quantum Physics Educational Company,

San Francisco CA, USA

Professor Jack Sarfatti asked me to provide an account of my contribution to what we may call acausality as it occurs in quantum physics. My first inklings about this subject occurred in 1983 when, in preparing my book[1] Star*Wave for publication, I read John Cramer’s interesting paper[2] on the subject of the backwards-through-time quantum wave function u* acting as a modulation of the normal forward-through-time quantum wave function u in standard nonrelativistic quantum physics.  Cramer’s account was a new way to interpret quantum physics and gave a plausible reason for calculating probabilities from the multiplication of u by u*.  In Cramer’s account the forward-through-time u was considered as an offer wave initiating at a point i where an event had occurred.  The backwards-through-time u* was likewise an echo wave initiated by a final event f traveling backwards-through-time to the offer event and thus completing a cycle.

What was, to my mind at that time, incomplete in Cramer’s account was that even though this gave the correct mathematical calculation of the probability in quantum physics describing the probable causal connection between i and f, it in no way described how the sequence i to f was made into an actual pairing of two real events.  If, for example, there were two possible final events f1 and f2, one would have both probabilities, i to f1 and i to f2, to take into account without actually knowing which final event, f1 or f2, actually occurred.

It seemed to me at the time that one should know what takes place if both the initial and final events are determined. This led me to further investigate the connection between human consciousness and quantum physical probabilities.

Consequently when I finished writing Star*Wave I had included a discussion of Cramer’s transactional interpretation (TI) as outlined in his paper as a possible mechanism for the way the mind works in relation to the brain.

A few years later I published The Body Quantum.[3] In chapter 25 I once again looked into the transactional interpretation.  I wrote:

Whenever atoms are arranged in a highly repetitive pattern, such as those found in crystals or in the long strands of molecular DNA, the quantum wave functions also take on a similar pattern.  This pattern constitutes a continual kind of observation in which the quantum wave function, in a sense, is observing itself over and over again.  Quantum waves and quantum wave functions can be imagined as constrained by such a pattern, which, in fact, gives the structure its stability.

The quantum wave function, in my view, turns on and off through the observer effect.  When an observation occurs, the quantum wave function “pops,” and a pointlike atom, or part of an atom, is manifested for an instant.  When no observation takes place, the quantum wave function “hangs around,” like a ghost, in the same locale in which it first popped.  This sequence is highly reinforced by the repeating structure.

To try to imagine this concept is difficult because there are many atoms involved.  The quantum wave functions, as I imagine it, are “resonating” with the structure of the molecules, so that each quantum wave function turns on and off with many oscillations.  From the solid molecule’s point of view, this corresponds to its own self-observation.

This viewpoint can be contrasted with a single atom’s self-observation: It, too, can be thought of as being in a self-observation pattern, wherein its quantum wave function turns on and off.  But being an isolated atom means that the pattern will display a higher degree of randomness.  At the atomic level, this pattern appears as the atom itself, vanishing and reappearing in a sequence of random points, blurring, more or less, into a solid object. 

Thus, each quantum wave function pattern is highly specific to the element it represents.  A quantum wave function for the hydrogen atom is quite different in detail from the quantum wave function of a carbon atom.

When a sugar-phosphate molecule repeats itself as an endless chain of snakelike strands, winding around each other much like a spiral staircase, an infinite hall-of-mirrors effect manifests itself, allowing the living, conscious molecule to appear.  I am describing, of course, the molecule of genetic life, deoxyribonucleic acid, or DNA.

The second idea is even stranger and more speculative.  There are actually two quantum wave functions involved in a quantum wave function pop, the second of which, the star quantum wave function or star wave (as I referred to it in my previous book[4]), is similar in form to the ordinary quantum wave function, only orientated backward in space-time.  Thus, an ordinary quantum wave function, u, moving from here-now to there-then, is met by a star quantum wave function, u*, from the there-then moving toward the here-now.  These quantum wave functions multiply together, yielding the product u*u; that is, u* multiplying u. Now, it is not speculation that one must multiply the ordinary quantum wave function u by its star quantum wave function u* in order to calculate the relative probability that quantum wave function events will occur; that is exactly what quantum physicists do when they determine the likelihood that any event will occur.  The speculation surrounds the idea that u* comes from the future, traveling backward through time, much like the wave that, bouncing off the shore, travels back toward the source of the wave.  I wasn’t able to justify this idea by any physical experiment, at least not at that time.

I believed this idea was important because it could explain how the evolution of anything can take place.  My idea is similar to those that Sir Fred Hoyle discusses in his book The Intelligent Universe.[5]  Merely left to the odds, it is extremely unlikely that anything as orderly as a human being would arise at all simply from random processes.  As I explained in Chapter 11 of The Body Quantum, there needs to be some form of intelligence involved.  But the question is, how does that intelligence act?  Of course, I could just postulate that there is a Supreme Intelligence and that this being can act in any way that it sees fit.  As Niels Bohr once remarked to Albert Einstein, when he was trying to figure out how God did it, “Stop telling God what to do.”

I certainly don’t want to do that! But I do want to know how God does it.  Yet, as a physicist, I am somewhat constrained: I can’t postulate just any idea, because a scientific idea, in order to be considered valid, must fit with what we already know (or, at least, “think” we know).  The idea that u* comes from the future may just save the day, however.  As Hoyle puts it:

If events could operate not only from past to future, but also from future to past, the seemingly intractable problem of quantum uncertainty could be solved.  Instead of living matter becoming more and more disorganized, it could react to quantum signals from the future—the information necessary for the development of life.  Instead of the Universe committed to increasing disorder and decay, the opposite could then be true.  (Hoyle, p.  213.)


In a highly organized material containing repeating patterns, the u*u content becomes highly repetitive, producing a probability pattern of reinforced strength.  Thus, crystals of repeating materials, such as sodium chloride, carbon lattices (such as diamond), and other single crystals of metals and metals in combinations with other substances, possess great strength or other unusual properties.

In DNA we have a similar phenomenon of great repetition, with complex patterns of sugar-phosphate backbones interrupted by the much longer, seemingly random steps of base pairs linked together in complementary codes.  These bases, you’ll recall, occur in four types: A, C, G, T.

Here a third idea surfaces: Because of the repetition of the DNA structure, the likelihood of a repeating u*u pattern is highly enhanced, with the u* involved propagating from a near future to the present.  The signal from the future is more or less the same as that from the past, and the pattern, consequently, tends toward stability.  The more stable the pattern, the less likely that the distant future will disturb it.  Again, the idea that there exists a resonance between the quantum wave function and its structure—involving both the past and the future—is at play here.  Signals from the distant future do arrive, however; without them, DNA would never alter its patterns.  But the more stable the reinforcement brought on by the repetition of the strand, the smaller the disturbance produced.  It is the interplay of the endless crystalline repetition of the DNA strands, twisting in space and dancing in time as vibrations with the almost though not quite random patterns of A, C, G, and T bases, that produces stable animal and plant consciousness.  Consciousness, as we commonly experience it, thereby emerges as a consequence of the quantum wave function vibration patterns associated with DNA vibration patterns repeating and resonating with both the future and the past.

Molecules of DNA within shouting distance of each other also vibrate, sending quantum semaphore messages back and forth, and in this manner a resonance arises between neighboring molecules.  This resonance is much like any other resonance phenomenon, such as a building’s vibrations in the wind or the rolling of a ship on the high seas.  With energy being fed from one molecule to the other at just the right frequency to induce the other molecule to respond, the two resonate together.  It is this resonance of waves in different cells that could result in the healing of the cells.

Illness could result from an opposite effect.  When molecules are off-resonance, they fail to communicate with each other; such off-resonance could arise from atomic changes in the molecules or from subtle changes in the probability patterns of the quantum wave functions, possibly brought on by negative thinking.  Influenced by such thinking, perhaps molecules tend to isolate themselves, forming self-contained units of limited capacity.  Such molecular isolation can be understood in terms of our own behavior when we feel depressed or unduly anxious about something, and want to be alone in our misery.

Consequently, illness and negative thinking could create molecular islands of separation within our cells.  Healing energy counters this separation tendency by fostering correlations between molecules: One molecule heals another.

   A few years later I looked once again at this issue of acausality in my paper published in the Journal of Theoretical Biology[6] entitled “On the Quantum Physical Theory of Subjective Antedating.” In this paper I was still wondering how quantum physical probabilities were related to conscious human experience, and I found a possible relationship. In brief: a single conscious instant requires in the physical brain both the occurrence of a past and a future event. In this paper I explained it as follows:

Assuming that neuronal adequacy and experience were one and the same, Libet (1985)[7] pointed out the obvious discrepancy between the time of the experience of an event—the subjective referral—and the time of neuronal adequacy required to experience the event. I suggested an alternative proposal. Neuronal adequacy and subjective experience were not one and the same events. Neither were peripheral stimulation and subjective experience one and the same events, even though they seemed to be. The truth actually lies somewhere in-between. Both the stimulation and neuronal adequacy (two events) are needed for the conscious (one event) experience, even though the time of that experience is referred back to the peripheral sensation.

This proposal was based on ideas put forward in my Star*Wave book (1984) and upon the new TI of quantum mechanics stated by Cramer (1983, 1986). According to the TI, a future event and a present event are involved in a transaction wherein a real quantum wave of probability (retarded wave), u, called the “offer” wave, issues from the present event to the future event. The future event is then stimulated to send back through time an “echo” wave (advanced wave), u*, towards the present event. The echo wave is the complex conjugate of the offer wave.

According to the rules of quantum mechanics, the probability distribution (probability per unit volume) for an event to occur is given by u*u. However, no other interpretation explains how this product arises physically. Following the TI, the echo wave modulates the offer wave, thus producing the required u*u probability pattern. Thus, it is necessary for future events to influence present or past events by sending back in time a corresponding echo wave, u*, following an offer wave, u, from the present or past. Specifically, the echo wave contains the complex conjugate reflection of the offer wave, multiplying the offer wave in much the same manner as an audio wave modulates a high-frequency carrier signal in radio broadcasting. The product u*u, which then gives the probability for a transaction—a correlation between the two events—arises as a probability field at the initial event.
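The algebraic point here—that the echo wave u* multiplying the offer wave u always yields a real, nonnegative probability density, whatever the complex offer wave—can be verified directly. A minimal numerical sketch (the specific superposition chosen is an arbitrary illustration, not from Cramer's papers):

```python
import numpy as np

x = np.linspace(0.0, 1.0, 1001)
k1, k2 = 2 * np.pi * 3, 2 * np.pi * 5

u = np.exp(1j * k1 * x) + 0.5 * np.exp(1j * k2 * x)   # offer wave (a superposition)
echo = np.conj(u)                                      # echo wave u*
prob = echo * u                                        # transaction: u* modulates u

assert np.allclose(prob.imag, 0.0)    # u*u is real ...
assert np.all(prob.real >= 0.0)       # ... and nonnegative: a probability density
```

The TI's contribution is not this algebra, which is standard, but the physical picture of why the conjugate appears: as a backwards-through-time echo modulating the offer wave.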

The events in question (stimulation and neuronal adequacy) are time-like separated. Thus, according to the theory of relativity, there is no way that these two events could ever be simultaneously observable, as was postulated by Snyder. Hence his attempt to resolve the paradox was not tenable.[8] However, the fact that the observer of those events sees them as simultaneous means that his mind acts as a kind of “time machine”. That is, the experience of the event is “projected” back in time towards the occurrence of the sensation.

Libet had suggested that this may be an illusion, that the real “recognition” of the event only occurred later at the time of neuronal adequacy and that the subject “subjectively” and mistakenly remembered the event as having occurred earlier. Whether or not this is an illusion is at present not experimentally testable. In any case, one must wonder why subjects believe that their knowledge or recognition of an event is a simultaneous occurrence with the event if indeed the knowing and the stimulation are time-like separated.

I believed this backwards-in-time projection between a neural event and a stimulus can be consistently accounted for using the TI of quantum mechanics. In this case, the present event (the peripheral sensation), S, sends a forward-through-time probability offer wave to a future event (neuronal adequacy), N. Most likely, N lies on the area of the cortex normally associated with the sensation. The future event, N, sends a backwards-through-time probability echo wave to the present event, S.

According to the TI, the S to N offer wave stimulates the N to S echo wave. The N to S echo wave then carries a replica of the S to N offer wave back towards the original stimulation. The N to S echo wave arriving at the location of the source, S, is the probability for the correlation of the events. If the two waves ‘resonate,’ meaning that the probability for the S to N correlation is large, then a significant probability for the two events is achieved. In this manner, that which is significantly measurable—has the largest probability—is also that which is brought to conscious awareness. In my view then, all possible future events are in contact with a present event, however, the most probable future events are those that produce the largest value of u*u, and consequently constitute an event in consciousness.

I offered the hypothesis that whenever two events are so correlated, i.e., the probability for the events is not a priori zero, they will be experienced as one and the same event. I suggested that this means, in general, that any two quantum physically correlated events separated in time or space will constitute a single experience.

Upon further thought I realized that although Cramer’s hypothesis took care of providing a probability distribution surrounding the offer event in spacetime, it still did not account for which final event corresponded to human experience. Years later, in 1998, I looked further into the work of Libet and his co-workers.[9] Aharonov and his coworkers had been working on a scheme to represent quantum physics in terms of two-time quantum wave functions—that is, one specifies not only the initial condition of the quantum wave function but also its final state. Consequently I based my new work on the work of Aharonov and his co-workers. I wrote the introduction and conclusion of the paper as follows:

I offer a quantum physical resolution similar to that of the Wheeler delayed choice experiment in quantum physics of the delay-and-antedating hypothesis/paradox put forward by Libet et al. (1979) to explain certain temporal anomalies associated with passive perception. I propose a model wherein two neural events cause backward-through-time and forward-through-time neurological signaling in accordance with wave function collapse in the intervening space-time interval. Pairs of causality-violating events must occur in the brain in order that a single experience in consciousness occurs. The model offers a first step towards the development of a quantum physical theory of subjective awareness and suggests that biological systems evolve and continue to function in accordance with a causality-violating, two-valued, transactional model of quantum mechanics. The model makes a new prediction about the timings of passive bodily sensory experiences and imagined or phantom sensory experiences. The predictions of the model are compared with experimental data indicating agreement and new experiments are proposed testing the model.

In his recent book Penrose (1989)[10] poses the paradox of the relationship of awareness and physical events that elicit it as follows, “Is there really an ‘actual time’ at which a conscious experience does take place, where that particular ‘time of experience’ must precede the time of any effect of a ‘free-willed response’ to that experience?... If consciousness... cannot be understood... without...quantum theory then it might... be... that... our conclusions about causality, non-locality, and contrafactuality are incorrect.”

Penrose believed that there are reasons for being suspicious of our physical notions of time in relation to physics whenever quantum non-locality and contrafactuality are involved.

I would add that the same thing must be said with regard to consciousness. He suggests that if, in some manifestation of consciousness, classical reasoning about the temporal ordering of events leads us to a contradictory conclusion, then this is strong indication that quantum actions are indeed at work!

In this paper I examined a quantum theory of the relationship between the awareness of timings of events and their corresponding physical correlates and showed that indeed not only are quantum actions at work, they are indispensable in explaining the temporal paradoxes inherent in the phenomena.

I concluded my article (1998) with these comments:

When it comes to time in physics, we are somewhat at a loss. All of our equations are alike in one very real sense: there is no specific order to the sequences of events we label as the passage of time. Both Newtonian physics and quantum physics share this apparent fault, in disagreement with our common-sense experiences. We could just as well write equations and set up appropriate spatial and temporal boundary conditions for retrodiction in place of prediction and feel equally satisfied that we had the correct equations. Indeed, if we do simple enough experiments, we find that retrodicting is as good as predicting when it comes to determining what shall happen in the next sequence of events, either following or preceding.

In life, with all of its complexity and its ultimate human measure, time marches on. Fallen cracked eggs do not jump off the floor into our outstretched hands. Dead loved ones do not reconstitute themselves and resurrect. We grow older each day not younger. How are we to ever explain this scientifically and fundamentally? It would seem that we are missing something essential when it comes to time.

Two bits of data we know. Conscious experience of events and the second law of thermodynamics. The first bit is subjective in its context while the second is purely objective. We certainly know that we can think a thought, write a sentence, and find the words are uniquely time ordered. We certainly know of the fact that hot bodies cool down and cold bodies warm up. Is there some connection between these data bits?

So far we have no theory that connects them. While much has been done in the objective arena to connect thermodynamics and statistical mechanics to quantum mechanics, including some remarkably clever insights, we still do not have a fundamental theory connecting them. Given Planck's constant, the speed of light, the gravitational constant, and the mass of any particle you wish to mention, we cannot derive Boltzmann's remarkable constant of nature.

In the world of subjective experience very little has been done by physicists, and probably for very good reason; no one knows what to do, what to measure, or even whether it is ethical to perform such measurements (usually involving the human brain) even if we knew what we were looking for. Here Libet's remarkable experiments need special mention. At least in them we are provided with a clue concerning subjective time order. Perhaps there is something fundamental in the notion that our equations are not time-order unique, and in the theory given here, according to which subjective experience requires two or more separate events to produce a single perception. Perhaps this theory, in which a perceived event requires information flowing from end points coming both before it and after it, much as a stringed musical instrument requires information coming from its nodal end points to set up standing wave patterns of musical harmony, is a fundamental requirement for both time-order uniqueness and subjective experience.

It would seem to me that now we need to look toward altering our concept of time in some manner, not that this is an easy thing to do. Perhaps we should begin with the idea that a single event in time is really as meaningless as a single event in space or a single velocity. Meaningful relation arises as a correspondence, a relationship with some reference object. Hence an object’s velocity is meaningfully measurable with respect to another object’s velocity as the relative velocity between them. In a similar manner as I point out in this paper, the timing of an event is also only meaningful in reference to another timing event. When the end points or reference times for the events are not specified, then only the relative interval becomes relevant. When that interval lies within the limitation of quantum uncertainty, the event referred to within the interval must also lie within that uncertainty. Failure to note this leads to apparent timing paradoxes.

The resolution of temporal paradoxes particularly as they show themselves in future quantum physical objective experiments and in subjective timing experiments will continue to require a new vision of time. Perhaps this paper will assist us in our search for a new theory of time.

According to Aharonov:

Until now we have limited ourselves to the possibility of two boundary conditions which obtain their assignment due to selections made before and after a measurement. It is feasible and even suggestive to consider an extension of quantum mechanics to include both a wavefunction arriving from the past and a second ‘destiny’ wavefunction coming from the future which are determined by two boundary conditions, rather than a measurement and selection. This proposal could solve the issue of the “collapse” of the wavefunction in a new and more natural way: every time a measurement takes place and the possible measurement outcomes decohere, then the future boundary condition simply selects one out of many possible outcomes. It also implies a kind of ‘teleology’ which might prove fruitful in addressing the anthropic and fine tuning issues. The possibility of a final boundary condition on the universe could be probed experimentally by searching for ‘quantum miracles’ on a cosmological scale. While a “classical miracle” is a rare event that can be explained by a very unusual initial boundary-condition, “Quantum Miracles” are those events which cannot naturally be explained through any special initial boundary-condition, only through initial-and-final boundary-conditions. By way of example, destiny-post-selection could be used to create the right dark energy or the right negative pressure.



I am of course pleased to see Aharonov's conclusion. I originated the idea in terms of brain function in my earlier publications, and it is the analysis of the problem as carried out by Aharonov and his colleagues on weak measurement theory and two-time quantum physics that makes the theory feasible.[11] Although we have known for perhaps close to 100 years that quantum physics really cannot be put into a mathematical logical format without realizing that its basic structure is probability theory, the relation between probabilities and physical events, although clear in quantum physics theory, is still fraught with many misunderstandings and needless philosophical meanderings. The main culprit in all of this is the rather dogmatic insistence on temporal causality being a fundamental cornerstone of any physical theory. As time marches on into the next 100 years of quantum physics history, we are beginning to see that causality is not in lockstep with temporal sequence: events that haven't happened yet do and will have an effect on events that are happening right now. We are not surprised that events that have occurred in the past, no matter how distant or close to the present moment, can and do affect the present. Even though logically the past is no more present than is the future, we might take it for granted that putting causality in lockstep with only a past-to-present one-way flow relationship is merely an old prejudice, one that we have accepted for perhaps thousands of years, most likely caused by our human survival instincts and our desire to have a rational universe.

In order to reach any logical form of causality within the range and environment of quantum physics thinking, it has become apparent that both events in the future and events in the past have a determining effect on the present. In essence, the seemingly magical complex arithmetic of quantum physical calculations becomes real, measurable predictions of the probabilities of events only when we are allowed to run the flows in both directions, from future to present and from past to present simultaneously, so to speak. It takes two events to make one conscious experience. This appears to be the direction we are taking to finally realize how mind and quantum physical reality relate, if they have any relationship at all.

[1] Wolf, Fred Alan. Star Wave: Mind, Consciousness, and Quantum Physics. New York: Macmillan, 1984.

[2] Cramer, John G. “Generalized Absorber Theory and the Einstein-Podolsky-Rosen Paradox,” Physical Review D 22 (1980), p. 362.

———. “The Transactional Interpretation of Quantum Mechanics,” Reviews of Modern Physics 58: 3 (July 1986), pp. 647–87.



[3] Wolf, Fred Alan. The Body Quantum: The New Physics of Body, Mind and Health. New York: Macmillan Publishing Co., 1986, ch. 25, pp. 209–18. Also published in England by Heinemann Ltd. and in Germany as Körper, Geist und neue Physik: Eine Synthese der neuesten Erkenntnisse von Medizin und moderner Naturwissenschaft. Scherz Verlag, 1989.

[4] Wolf, Fred Alan. Star Wave: Mind, Consciousness, and Quantum Physics. New York: Macmillan, October 1984. Also published as Mind and the New Physics. London: Heinemann Ltd., March 1985.

[5] Fred Hoyle. The Intelligent Universe. NY:  Holt, Rinehart, & Winston.  1984.

[6] Wolf, Fred Alan. “On the Quantum Physical Theory of Subjective Antedating,” Journal of Theoretical Biology 136 (1989), pp. 13–19.

[7] Libet, B. (1985). Subjective Antedating of a Sensory Experience and Mind-Brain Theories: Reply to Honderich (1984). J. theor. Biol. 114, 563-570.

Libet, B., Wright, E. W., Feinstein, B. & Pearl, D. K. (1979). Brain 102, 193.


[8] Snyder, D. M. Letter to the editor: On the Time of a Conscious Peripheral Sensation. J. theor. Biol. 130, (1988) 253-254.


[9] Wolf, F. A. (1998). The timing of conscious experience. Journal of Scientific Exploration, 12, 4, 511.

[10] Penrose, R. (1989). The Emperor’s New Mind. New York: Penguin Books, p. 442.

[11] Wolf, Fred Alan. “On the Quantum Physical Theory of Subjective Antedating,” Journal of Theoretical Biology 136 (1989), pp. 13–19.

Wolf, F. A. (1998). “The timing of conscious experience.” Journal of Scientific Exploration, 12, 4, 511.

Oct 03

Classical nonlocality of gravity field energy

Posted by: JackSarfatti |
Tagged in: Untagged 

On Oct 3, 2010, at 2:32 PM, Paul Zielinski wrote:

You cannot actually get around this with quasi-local quantities. You have to abandon the classic EP argument that frame
transformations literally cancel first-order geometric distortions of a flat Minkowski geometry in GR.

As you know, I think you are in serious error on this specific point.

You're entitled to your opinion. But if the energy carried by the waves is determined by first-order terms in the expansion of the metric around any given point, then if there is no first-order invariant measure of deviation from Minkowski geometry, we are stuck with a spooky non-local vacuum energy density, which is a serious headache in canonical GR. As we've previously discussed at great length.

Right - but this weird classical nonlocality is accepted by the Pundits. Maybe Hagen Kleinert's math does what you want?

Gravitational energy density is not determined by the Riemann curvature in Einstein's theory. It is determined by purely first order quantities.

The basic problem is that total energy is not well defined in GR except in the very special cases where there are timelike Killing vector fields. This corresponds to the flat-spacetime Noether theorem, in which total energy is conserved when there is symmetry under time translations. That is simply not true at all in our actual accelerating, expanding universe, for example.
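For the record, the Killing-vector statement can be sketched as follows (standard GR, nothing beyond what the paragraph asserts):

```latex
% If \xi^\mu is a timelike Killing vector field, \nabla_{(\mu}\xi_{\nu)} = 0,
% then the current J^\mu = T^{\mu\nu}\xi_\nu built from the stress-energy
% tensor is covariantly conserved, and its flux through a spacelike slice
% \Sigma defines a conserved total energy:
\nabla_{(\mu}\xi_{\nu)} = 0
\;\Longrightarrow\;
\nabla_\mu\!\left(T^{\mu\nu}\xi_\nu\right) = 0
\;\Longrightarrow\;
E \equiv \int_\Sigma T^{\mu\nu}\xi_\nu \, d\Sigma_\mu = \text{const.}
% An accelerating FRW universe has no timelike Killing field,
% so no such globally conserved E exists.
```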

Is Energy Conserved in General Relativity? (Usenet Physics FAQ) - www.phys.ncku.edu.tw/mirrors/.../Relativity/.../energy_gr.html

Conservation of energy (Wikipedia): "In general relativity conservation of energy-momentum is expressed with the aid of a stress-energy-momentum pseudotensor." - en.wikipedia.org/wiki/Conservation_of_energy

Energy conservation in general relativity (Physics Forums archive thread) - www.physicsforums.com

K. Brading (2005), "A Note on General Relativity, Energy Conservation, and Noether's Theorems," on claims made by Klein and Hilbert about the status of the energy conservation law in GR - www.springerlink.com/index/rv2n078811344m45.pdf

M. B. Mensky (2004), "Energy conservation and equivalence principle in General Relativity": a covariant integral form of the conservation law for the energy-momentum of matter is derived in GR - linkinghub.elsevier.com/retrieve/pii/S0375960104008527 - arxiv.org/abs/gr-qc/0409121

"Energy Is Not Conserved," Cosmic Variance, Discover Magazine (Feb 22, 2010): "in general relativity spacetime can give energy to matter" - blogs.discovermagazine.com/.../02/.../energy-is-not-conserved/

Sean Carroll, Lecture Notes on General Relativity (including the 24-page No-Nonsense Introduction) - preposterousuniverse.com/grnotes/

"Energy Conservation," viXra log - blog.vixra.org/category/energy-conservation/

"General Relativistic Violation of the Conservation of Energy Law" (Word document): "So long as the Killing vector symmetry holds, the energy conservation law is applicable and is not violated, even under general relativity." - www.cheniere.org/.../General%20Relativistic%20Violation%20of%20the%20Conservatio...

I think everyone who matters in this field now admits that this is a major headache in GR.
But the energy carried by the waves is *first* order in the metric.
This is different from Hagen Kleinert's multi-valued maps corresponding to real invariant topological defects, e.g. disclination defects, that are not found in Einstein's 1916 restricted GCTs. What you were looking for, Z, is in Kleinert's work.

No, what I was looking for is what I found. No need for Kleinert's model to solve the problem. You just have to get over the root misconception that the LC covariant derivative is the only curved-space covariant derivative that can be legitimately defined in GR without altering the underlying intrinsic geometry.

You cannot introduce new connections without justification. It's fine to introduce torsion and non-metricity tensor additions to Levi-Civita only if they explain data anomalies like the Pioneer, flyby etc. Otherwise, ghostly connections without empirical necessity are excess metaphysical baggage in my opinion - they are less with more.

On Oct 3, 2010, at 1:57 PM, Paul Zielinski wrote:

On Sun, Oct 3, 2010 at 1:22 PM, JACK SARFATTI <sarfatti@pacbell.net> wrote:
see your Sean Carroll notes

the basic ambiguity is the split between background and wave metrics

Yes of course.

physical metric = background metric + gravity wave metric

The argument was made that since any gravity wave could be eliminated ("zeroed out") by a suitable choice of coordinates,
gravity waves were not real. This is closely related to the EP argument against objective local gravitational energy. I understand that in the case of gravity waves at least this argument is no longer taken seriously.

also need to restrict to Minkowski background for Hulse-Taylor to get the quadrupole formula that is actually measured.

Of course -- because the actual gravity wave is defined as the coordinate-invariant deviation from flat vacuum geometry.

The true observable is the 4th rank curvature tensor of the complete physical metric field.

This is what I've been saying all along. You cannot actually get around this with quasi-local quantities. You have to abandon the classic EP argument that frame transformations literally cancel first-order geometric distortions of a flat Minkowski geometry in GR.

As you know, I think you are in serious error on this specific point. The first-order Levi-Civita connections are only zero at the Centers Of Mass (COMs) of the LIFs that are not rotating and are on timelike geodesics of the complete metric. If you exceed the local curvature radius at that COM, then you are in a new LIF' and need to zero the Levi-Civita terms at that new COM'. The covariant curl of the Levi-Civita connection is not zeroed out by this NON-SINGULAR local transformation. This is different from Hagen Kleinert's multi-valued maps corresponding to real invariant topological defects, e.g. disclination defects, that are not found in Einstein's 1916 restricted GCTs. What you were looking for, Z, is in Kleinert's work.

Hagen Kleinert, Multivalued Fields: In Condensed Matter, Electromagnetism, and Gravitation (World Scientific, Singapore, 2008, 497 pages) - www.amazon.com - books.google.com - www.paperbackswap.com/Multivalued...Kleinert/.../981279171X/

Multivalued function (Wikipedia), citing Kleinert's Multivalued Fields - en.wikipedia.org/wiki/Multivalued_function - en.academic.ru/dic.nsf/enwiki/142144

P. Jizba et al., "Uncertainty ...," Phys. Rev. D 81, 084030 (2010), citing Kleinert's Multivalued Fields.

Homepage of Hagen Kleinert (world crystal page) - users.physik.fu-berlin.de/~kleinert/kleinert/?p=worldcrystal

H. Kleinert, "Field transformations to multivalued fields" (2007) and "Multivalued fields" (2009), Institut für Theoretische Physik, Arnimallee 14, D-14195 Berlin.

CiteSeerX entry: Multivalued Fields in Condensed Matter, Electromagnetism, and Gravitation - citeseerx.ist.psu.edu

A gravity wave is emitted because the source is not on a timelike geodesic relative to the full physical metric.
e.g. the Hulse-Taylor case: on timelike geodesics relative to the background metric that accelerate relative to the full physical metric?

There should be no radiation on true geodesics - this is one aspect of the classical nonlocality of gravity energy.
It's very tricky.

Then each LIF constitutes a preferred frame relative to which accelerating motion of gravitational sources is judged.
Is this really what you mean to argue here?

Valentini thinks the low power anomaly at wide angles in the WMAP cosmic microwave background temperature fluctuation data may be evidence of signal nonlocality in the pre-inflation false vacuum - this would be a violation of orthodox quantum theory.

It may be shown that quantum nonequilibrium for entangled systems leads to nonlocal signals at the statistical level, in pilot-wave theory (as already mentioned) and indeed in any deterministic hidden-variables theory; while in equilibrium, the underlying nonlocal effects cancel out at the statistical level [14,15,18,19,26]. Locality is therefore a contingency (or emergent feature) of the equilibrium state. Similarly, standard uncertainty-principle limitations on measurements are also contingencies of equilibrium [14,15,20,24]. These results provide an explanation for the otherwise mysterious "conspiracy" in the foundations of current physics, according to which (roughly speaking) quantum noise and the uncertainty principle prevent us from using quantum nonlocality for practical nonlocal signaling. From the above perspective, this "conspiracy" is not part of the laws of physics, but merely a contingent feature of the equilibrium state (much as the inability to convert heat into work, in a state of global thermal equilibrium, is not a law of physics but a contingency of the state). On this view, quantum physics is merely the effective description of a particular state, just as, for example, the standard model of particle physics is merely the effective description of (perturbations around) a particular vacuum state (arising from spontaneous symmetry breaking). If one takes this view seriously, it suggests that nonequilibrium phenomena should exist somewhere (or some time) in our Universe. And again, the early Universe seems the natural place to look. -- Valentini

In the case of particle mechanics (for simplicity), the hidden variables are the particle positions on some spacelike hypersurface. Clearly, if these particles are pumped they are kept out of thermal equilibrium and should show signal nonlocality, with statistical behavior violating the Born rule. Valentini only considers closed systems without any external pump. Furthermore, if these particles are in a Bose-Einstein condensate, the macro-quantum coherent phase rigidity will definitely cause large departures from the statistical predictions of ordinary quantum theory, in which these same particles are independent, forming an ensemble.
Oct 03

How much zero point energy is in the vacuum?


On Oct 2, 2010, at 4:28 PM, james f woodward wrote:

Well, I'm not trying to pick a fight here, as the only thing that really
matters is if there is any way to make ABWs.  But I should point out that
Thorne is the one who stipulates exotic rest mass. 

So Kip made a minor mistake, i.e., a too restrictive premise. He did that back in 1986 before discovery of the accelerating universe.

Also, near EM fields are macro-quantum coherent states of virtual photons, including longitudinal polarizations; they have zero rest mass, but still have a Tuv that makes a Guv bending spacetime.

Furthermore, in a meta-material if both permittivity and permeability are negative, the near EM field energy density is negative - despite some people claiming otherwise, they are in error in my opinion.

The next hypothesis to be tested is that it is the actual speed of light in a material that determines the gravity coupling of that material, i.e.

the coupling is n^4 G/c^4, where n is the index of refraction and c the vacuum speed of light - in other words, the vacuum coupling with c replaced by the speed of light in the material, c/n.

this is Popper falsifiable.
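As a minimal numeric sketch of the hypothesis just stated (the function name and the sample index n = 4 are my own illustrative choices, not values from the text):

```python
# Under the stated hypothesis, the gravity coupling of a material scales
# as n^4 * G / c^4 (n = index of refraction), so a high-index medium
# enhances the effective coupling by a factor n^4 relative to vacuum.
def coupling_enhancement(n):
    """Ratio of in-medium to vacuum gravity coupling under the n^4 hypothesis."""
    return n ** 4

print(coupling_enhancement(1.0))  # vacuum: 1.0, no enhancement
print(coupling_enhancement(4.0))  # hypothetical high-index medium: 256.0
```

A testable (Popper-falsifiable) consequence: any measured enhancement should track the fourth power of the measured index.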

That's why I quoted his appendix where he lays out the requirements of ABWs.

But let's assume that you are proposing something equivalent to rest mass
in the form of, say, a hypothetical structure that induces a state of
exoticity in its vacuum interstices.  (Sounds like a metamaterial, eh?)
The problem here is that there is no latent state of exoticity in the
vacuum to be exposed by such a structure.   Mean cosmic matter density is
on the order of 10^-27 gm/cm^3 (perhaps less), 70% of which is exotic
dark energy.  The Jupiter mass of exotic matter you need for an ABW is on
the order of 10^22 gm/cm^3.  That's nearly 50 orders of magnitude
difference.  If that were in the form of virtual particles in the vacuum,
its mean energy density would not be zero pending exposure in a
metamaterial.  It would turn the entire universe into a wormhole.  What
you have is sort of the inverse of the "cosmological constant problem".
When you try to put stuff into the vacuum as virtual processes, there is
nothing else there to dress it to latency that can be manipulated to
expose the latent quantity.

While the vacuum is an unconvincing repository for humungous amounts of
latent exotic matter,

Actually this is probably not true at all. The evidence is that there is only a tiny ambient virtual particle density ~ 10^-29 gm/cc and that the old Wheeler estimate of hc/Lp^4 is completely wrong! The hologram theory explains why that is the case, i.e. the density is ~ hc/[(Lp)^2 x (area of future horizon)].
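An order-of-magnitude check of the two estimates just quoted; the horizon radius R ~ 1.6e28 cm below is my assumed input for the future-horizon scale, not a value from the text:

```python
# Order-of-magnitude comparison (cgs units) of the old Wheeler vacuum
# energy estimate hbar*c/Lp^4 with the hologram estimate
# hbar*c/(Lp^2 * horizon area), both converted to mass densities.
import math

hbar = 1.0546e-27      # erg*s
c    = 2.998e10        # cm/s
Lp   = 1.616e-33       # Planck length, cm
R    = 1.6e28          # assumed future-horizon radius, cm

# Wheeler estimate: energy density hbar*c/Lp^4, divided by c^2:
rho_wheeler = hbar / (c * Lp**4)            # g/cm^3

# Hologram estimate: hbar*c/(Lp^2 * A), divided by c^2:
area = 4.0 * math.pi * R**2
rho_holo = hbar / (c * Lp**2 * area)        # g/cm^3

print(f"Wheeler estimate : {rho_wheeler:.1e} g/cm^3")
print(f"Hologram estimate: {rho_holo:.1e} g/cm^3")
# The hologram number lands near the observed ~1e-29 g/cm^3 dark-energy
# density; the Wheeler estimate overshoots by roughly 120 orders of magnitude.
```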

virtual particles or not, normal matter is a quite
reasonable repository.  Indeed, even in the standard model bare masses
are idiotically large and negative.  The standard model seems not to have
any reasonable process that might expose the latent exotic bare mass.
But the ADM model with Mach's principle does allow for such exposure.

So, arguably, there is no conceptual flaw. . . .

On Sat, 2 Oct 2010 13:13:33 -0700 JACK SARFATTI <sarfatti@pacbell.net>
Paper is very useful in general and I will quote it in my book Star
On Oct 2, 2010, at 10:18 AM, JACK SARFATTI wrote:

"The problem with absurdly benign wormholes of traversable
dimensions is that they require „exotic‰ rest mass matter ˆ that is,
real stuff you can stop in the lab that has negative mass."

Not necessary. Virtual particles inside the vacuum directly bend
spacetime because of the equivalence principle. The premise you make
is sufficient, but not necessary.
Oct 02

From Feynman's Caltech lectures, for zero mass:
Spin 1 gravitons would show anti-gravity like dark energy, i.e. likes repel.
Spin 0 violates the equivalence principle for gravitational binding energies as a gas gets hotter. 10/3/10

Obviously, since gravity is simply the local gauge theory of the de Sitter group, the real question is why we do not see zero-mass spin 0 and spin 1 gravity waves in addition to spin 2 gravity waves. Why do spin 0 and spin 1 get a large rest mass, for example? Also, why is the spin 0 graviton not the Higgs particle, since it must universally couple to everything because of the equivalence principle? Can we also have spin 1 Higgs particles? Why not?
On Zielinski's question, I agree with Rovelli. The point is that small elements of rotating water in it are not on timelike geodesics in the local curved spacetime. They are pushed off geodesics by the friction/viscosity. Note that in rotating superfluids, quantized vortices form with normal-fluid cores that have viscosity. The macro-quantum coherent phase rigidity (P. W. Anderson) acts as an effective quantum potential quasi-force to push the superfluid elements off the local geodesics.
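The quantized vortices mentioned above each carry one quantum of circulation, kappa = h/m. A minimal numeric sketch for superfluid helium-4, using textbook constants:

```python
# Circulation quantum kappa = h/m for superfluid helium-4 (cgs units).
h = 6.626e-27          # Planck constant, erg*s
m_he4 = 6.646e-24      # mass of a helium-4 atom, g

kappa = h / m_he4      # circulation quantum, cm^2/s (~1e-3)
print(f"circulation quantum: {kappa:.2e} cm^2/s")
```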
In the hologram theory, the curved spacetime of Einstein's GR is simply a retro-causal 3D + 1 image from the future event horizon hologram computer 2D + 1 screen.  Admittedly this is a really crazy idea. It may be crazy enough to be true.
The fat lady has not sung yet on this idea.

Oct 02

Lenny Susskind video on quantum field theory

Lenny Susskind Stanford University video lecture on quantum field theory.
Oct 02

STUXNET Attack on Iran?


From Demo Hassan


"A study of the spread of Stuxnet by Symantec
showed that the main affected countries as
of August 6 2010 were:

Country Infected Computers

       Iran           62,867
       Indonesia      13,336
       India           6,552
       United States   2,913
       Australia       2,436
       Britain         1,038
       Malaysia        1,013
       Pakistan          993
       Germany            15

Symantec estimates that the group developing
Stuxnet would have been well-funded, consisting
of five to ten people, and would have taken
six months to prepare.

Symantec claims that the majority of infected
systems were in Iran (about 60%), which has led
to speculation that it may have been deliberately
targeting "high-value infrastructure" in Iran
including either the Bushehr Nuclear Power Plant
or the Natanz nuclear facility.
Ralph Langner, a German cyber-security researcher,
called the malware "a one-shot weapon" and said
that the intended target was probably hit, although
he admitted this was speculation.

There are reports that Iran's uranium enrichment
facility at the Natanz facility was the target of
Stuxnet and the site sustained damage because of
it causing a sudden 15% reduction in its production
capabilities. There was also a previous report by
wikileaks... [  http://wikileaks.org/
       http://twitter.com/wikileaks  ] ...disclosing
a "nuclear accident" at the said site in 2009...
http://en.wikipedia.org/wiki/Stuxnet "

Oct 01

There is a serious physics error in the Discover article:

Another example of the trickiness of coordinates, drawn from general relativity, is the black hole. In the canonical Schwarzschild coordinates describing a black hole, it looks like terrible things (e.g., singularities) happen at the event horizon [or Schwarzschild radius, which represents the 'surface'] of the black hole. But these are a problem with the coordinates. In truth, nothing particularly weird happens as you cross the surface of a black hole (besides gravitational lensing causing the sky to appear bent and warped). This can be seen by writing the exact same spacetime in different coordinates (e.g., Kruskal coordinates), where everything becomes well behaved (except for the singularity itself). No big deal crossing the event horizon (though all hell breaks loose as you approach the singularity). A similar confusion resided in the nature of gravitational waves.

Something weird does happen if you are lowered on a cable, or have rocket engines thrusting to the center of the black hole. There is the quantum Unruh effect, such static observers will see black body radiation of temperature

kBT(r) ~ (hc rs/r^2)(1 - rs/r)^(-1/2) + hc/rs --> infinity as r --> rs+

whilst a locally coincident geodesic observer sees only the Hawking black-body radiation of temperature ~ hc/(rs kB).

These static observers will also collide with in-falling matter that is infinitely blue-shifted at the event horizon.
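A minimal sketch of the divergence just described, keeping only the Tolman blueshift factor and dropping all order-unity coefficients (my simplification):

```python
# Temperature seen by a static observer at radius r > rs goes like
# T(r) ~ T_Hawking * (1 - rs/r)^(-1/2), diverging as r -> rs from outside.
def blueshift(r_over_rs):
    """Tolman factor (1 - rs/r)^(-1/2) for a static observer at r > rs."""
    return (1.0 - 1.0 / r_over_rs) ** -0.5

for x in (2.0, 1.1, 1.01, 1.0001):
    print(f"r/rs = {x:7.4f}   T/T_Hawking ~ {blueshift(x):.1f}")
```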

On Sep 30, 2010, at 4:55 PM, nick herbert wrote:
Here's a nice blog post by daniel holz (feynman professor at Los Alamos) about the history of gravitational waves:
The comments are particularly amusing and instructive.
Since gravity is so weak there are few opportunities to test Einstein's theory (and its competitors) experimentally.
The newest test is the lifetime of the Hulse-Taylor binary pulsar which agrees with GR to 0.2% and for which Hulse & Taylor
appropriately received the 1993 Nobel Prize. First indirect measurement of Einsteinian gravitational waves.
The intensity of such waves depends on how many polarization degrees of freedom one assigns to the graviton. For a speed of light spin-2 graviton the number of degrees is 2. The intensity of G-radiation depends directly on these degrees of  freedom so if you double  the degrees of freedom you double the intensity and seriously mess up the agreement with experiment. Graviton degrees of freedom is not a small effect.
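For reference, the quadrupole formula that the Hulse-Taylor comparison rests on, with radiated power summed over the two transverse polarizations of the massless spin-2 graviton, can be written as (standard GR):

```latex
% Power radiated in gravitational waves from a source with mass
% quadrupole moment Q_{ij} (angle brackets denote a time average):
P = \frac{G}{5c^5}\,
    \left\langle \dddot{Q}_{ij}\,\dddot{Q}^{ij} \right\rangle ,
% which for two point masses m_1, m_2 in a circular orbit of
% separation a reduces to the textbook result
P = \frac{32}{5}\,\frac{G^4}{c^5}\,
    \frac{(m_1 m_2)^2\,(m_1 + m_2)}{a^5} .
% Extra polarization degrees of freedom would add further terms to P,
% spoiling the 0.2% agreement with the binary pulsar data.
```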
Jack has proposed a clever path to quantum gravity that invokes spin-1 tetrads as the basis out of which the spin-2 graviton is composed and he has not failed to recognize that in quantum mechanics 1 + 1 can add up not only to 2 but to zero and to one also. Therefore on the face of it,  Jack's suggestion predicts not only a spin-2 graviton but a spin-1 and a spin-0 graviton as well--for which I propose the names scalar and vector Sarfattions.

The spin 0 and spin 1 gravitons may have rest mass. As you point out the Hulse-Taylor data only shows massless spin 2. Yukawa and all that.
Jack is not usually shy about announcing his latest discoveries. Where is Jack's press release concerning his prediction of these brand new gravitational force particles? Depending on the details of how they emerge from the Basic Tetrad, these two new Sarfattions could add at least 3 new polarization degrees of freedom (one for the scalar Sarfattion and two for the vector Sarfattion) to standard GR.
But the Hulse-Taylor results show that GR does not need extra polarization degrees of freedom. Therefore in order for the universe to make sense:
Perhaps the first step in the necessary suppression of Sarfattions is to keep them from being mentioned at scientific conferences. Or even in private e-mail conversations such as this one. Since I have already let the cat out of the bag I offer a compromise--that the three Sarfattions be referred to only by their initials in a kind of secret code known only to the inner circle. Thus SS, VS, and TS for the scalar, vector and tensor "you-know-whats".
Jack's new "you-know-what" particles possess one unique property. They are one of the few new particles NOT PREDICTED TO BE SEEN at the Large Hadron Collider.

Good point, especially if they have rest mass. In any case, the Hulse-Taylor data shows that if they exist they are not massless, therefore, we can't detect them in the far-field.
Nick Herbert
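A minimal sketch of the Yukawa ("and all that") suppression invoked above for the massive spin-0 and spin-1 modes; the 1 km Compton wavelength is purely an illustrative assumption, not a claimed value:

```python
# A mediator of rest mass m has Compton wavelength lambda_C = hbar/(m*c),
# and its static potential is suppressed by exp(-r/lambda_C), which is why
# massive modes would be invisible in the far field.
import math

def yukawa_suppression(r_cm, compton_cm):
    """exp(-r/lambda_C) factor multiplying a massive mediator's potential."""
    return math.exp(-r_cm / compton_cm)

lam = 1.0e5  # assumed Compton wavelength: 1 km, in cm (illustration only)
print(yukawa_suppression(1.0e3, lam))   # at 10 m: barely suppressed
print(yukawa_suppression(1.0e7, lam))   # at 100 km: suppressed by e^-100
```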