French paper explaining renormalizability

We know now that the invisible hand that creates divergences in some theories is actually the existence in these theories of a no man’s land in the energy (or length) scales for which cooperative phenomena can take place, more precisely, for which fluctuations can add up coherently.10 In some cases, they can destabilize the physical picture we were relying on, and this manifests itself as divergences. Renormalization, and even more so the renormalization group, is the right way to deal with these fluctuations. ...

Let us draw our first conclusion. Infinities occur in the perturbation expansion of the theory because we have assumed that it was not regularized. Actually, these divergences have forced us to regularize the expansion and thus to introduce a new scale Λ. Once regularization has been performed, renormalization can be achieved by eliminating g0. The limit Λ → ∞ can then be taken. The process is recursive and can be performed only if the divergences possess, order by order, a very precise structure. This structure ultimately expresses that there is only one coupling constant to be renormalized. This means that imposing only one prescription at x = μ is enough to subtract the divergences for all x. In general, a theory is said to be renormalizable if all divergences can be recursively subtracted by imposing as many prescriptions as there are independent parameters in the theory. In QFT, these are masses, coupling constants, and the normalization of the fields. An important and non-trivial topic is thus to know which parameters are independent, because symmetries of the theory (like gauge symmetries) can relate different parameters (and Green functions).
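The recursive subtraction described here can be sketched with a toy one-loop series. Nothing below is taken from the paper's own equations: the functions, the series F(x) = g0 + a g0^2 ln(Λ/x), and the value a = 1 are illustrative assumptions, chosen only to show the mechanism of imposing one prescription at x = μ.

```python
import math

def F_bare(x, g0, Lambda, a=1.0):
    """Toy perturbative series F(x) = g0 + a*g0^2*ln(Lambda/x):
    at fixed g0 it diverges term by term as Lambda -> infinity."""
    return g0 + a * g0**2 * math.log(Lambda / x)

def g0_from_prescription(gR, mu, Lambda, a=1.0):
    """Invert the single prescription F(mu) = gR to second order:
    g0 = gR - a*gR^2*ln(Lambda/mu) + O(gR^3); g0 itself diverges with Lambda."""
    return gR - a * gR**2 * math.log(Lambda / mu)

gR, mu, x = 0.005, 1.0, 5.0
finite_part = gR + gR**2 * math.log(mu / x)   # the renormalized, Lambda-free prediction
for Lambda in (1e3, 1e6, 1e9):
    g0 = g0_from_prescription(gR, mu, Lambda)
    # F expressed through gR stays close to the Lambda-independent value,
    # even though g0 drifts away as the cut-off grows
    print(Lambda, g0, F_bare(x, g0, Lambda))
```

The single prescription at x = μ removes the cut-off dependence for all x at this order, which is the toy version of the statement that only one coupling constant needs to be renormalized.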

Let us once again recall that renormalization is nothing but a reparametrization in terms of the physical quantity gR. The price to pay for renormalizing F is that g0 becomes infinite in the limit Λ → ∞, see Eq. (12). We again emphasize that if g0 is a non-measurable parameter, useful only in intermediate calculations, it is indeed of no consequence that this quantity is infinite in the limit Λ → ∞. That g0 was a divergent non-physical quantity has been common belief for decades in QFT. The physical results given by the renormalized quantities were thought to be calculable only in terms of unphysical quantities like g0 (called bare quantities) that the renormalization algorithm could only eliminate afterward. It was as if we had to make two mistakes that compensated each other: first introduce bare quantities in terms of which everything was infinite, and then eliminate them by adding other divergent quantities. Undoubtedly, the procedure worked, but, to say the least, the interpretation seemed rather obscure. ...

A very important class of field theories corresponds to the situation where g0 is dimensionless, and x, which in QFT represents coordinates or momenta, has dimensions (or more generally when g0 and x have independent dimensions). In four-dimensional space-time, quantum electrodynamics is in this class, because the fine structure constant is dimensionless; quantum chromodynamics and the Weinberg-Salam model of electro-weak interactions are also in this class. In four space dimensions the φ^4 model relevant for the Ginzburg-Landau-Wilson approach to critical phenomena is in this class too. This particular class of renormalizable theories is the cornerstone of renormalization in field theories. ...

Note that we have obtained logarithmic divergences because we have studied the renormalization of a dimensionless coupling constant. If g0 was dimensional, we would have obtained power law divergences. This is for instance what happens in QFT for the mass terms ...
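As an illustration (a toy integral of my own, not one of the paper's): a dimensionless coupling typically picks up the logarithm of the cut-off from a fluctuation integral like ∫ dq/q, while a dimensionful mass-type term picks up a power, e.g. ∫ q dq ~ Λ²/2.

```python
import math

def log_divergent(Lambda, x=1.0, n=100_000):
    """Midpoint-rule estimate of the log-divergent integral of dq/q
    from x to Lambda; the exact value is ln(Lambda/x)."""
    h = (Lambda - x) / n
    return sum(h / (x + (k + 0.5) * h) for k in range(n))

def power_divergent(Lambda, x=1.0):
    """Quadratically divergent integral of q*dq from x to Lambda,
    done exactly: (Lambda^2 - x^2)/2 -- a mass-type divergence."""
    return (Lambda**2 - x**2) / 2.0

# doubling the cut-off only *adds* ln(2) to the log divergence...
grows_slowly = log_divergent(2000.0) - log_divergent(1000.0)
# ...but roughly *quadruples* the power-law divergence
grows_fast = power_divergent(2000.0) / power_divergent(1000.0)
```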

there should exist an equivalence class of parametrizations of the same theory and that it should not matter in practice which element in the class is chosen. This independence of the physical quantity with respect to the choice of prescription point also means that the changes of parametrizations should be a (renormalization) group law ...

if we were performing exact calculations: we would gain no new physical information by implementing the renormalization group law. This is because this group law does not reflect a symmetry of the physics, but only of the parametrization of our solution. This situation is completely analogous to what happens for the solution of a differential equation: we can parametrize it at time t in terms of the initial conditions at time t0, for instance, or we can use the equation itself to calculate the solution at an intermediate time τ and then use this solution as a new initial condition to parametrize the solution at time t. The changes of initial conditions that preserve the final solution can be composed thanks to a group law. ...
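The initial-condition analogy can be made concrete with an equation whose solution map is known in closed form; dy/dt = y² is a convenient toy choice for this sketch (and, not coincidentally, has the same form as a one-loop flow equation):

```python
def flow(y0, t0, t):
    """Exact solution map of dy/dt = y^2: y(t) = y0 / (1 - y0*(t - t0))."""
    return y0 / (1.0 - y0 * (t - t0))

y0, t0, tau, t = 0.5, 0.0, 1.0, 1.5
direct = flow(y0, t0, t)                    # parametrize by the condition at t0
via_tau = flow(flow(y0, t0, tau), tau, t)   # re-parametrize at the intermediate time tau
# both parametrizations give the same final solution: the changes of initial
# conditions compose according to a group law
```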

Our main goal in this section is to show that, independently of the underlying physical model, dimensional analysis together with the renormalizability constraint determines almost entirely the structure of the divergences. This underlying simplicity of the nature of the divergences explains why there is no combinatorial miracle in the Feynman diagrams of QFT, contrary to what might seem at first glance.


V. SUMMARY

(1) The long way of renormalization starts with a theory depending on only one parameter g0, which is the small parameter in which perturbation series are expanded. In particle physics, this parameter is in general a coupling constant, like an electric charge involved in a Hamiltonian (more precisely, the fine structure constant for electrodynamics). This parameter is also the first order contribution of a physical quantity F. In particle/statistical physics, F is a Green/correlation function. The first order of perturbation theory neglects fluctuations — quantum or statistical — and thus corresponds to the classical/mean field approximation. To this order, the parameter g0 is also a measurable quantity, because it is given by a Green function. Thus, it is natural to interpret it as the unique and physical coupling constant of the problem. If, as we suppose in the following, g0 is dimensionless, so is F. Moreover, if x is dimensional — it represents momenta in QFT — it is natural that F does not depend on it, as is found in the classical theory, that is, at first order of the perturbation expansion.

(2) If F does depend on x, as we suppose it does at second order of perturbation theory, it must depend on another dimensional parameter, through the ratio of x and Λ. If we have not included this parameter from the beginning in the model, the x-dependent terms are either vanishing, which is what happens at first order, or infinite as they are at second and higher orders. This is the very origin of divergences (from the technical point of view).

(3) These divergences require that we regularize F. This requirement, in turn, requires the introduction of the scale that was missing. In the context of field theory, the divergences occur in Feynman diagrams for high momenta, that is, at short distances. The cut-off suppresses the fluctuations at short distances compared with Λ^−1. "

Note that with quantum gravity, tiny black holes form at short distances, giving a natural cut-off at (at least) the Planck length Lp, where

Δx ~ ħ/Δp + Lp^2 Δp/ħ

and the second term on the right-hand side is the correction added to the Heisenberg uncertainty principle.
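The added term implies a minimal position uncertainty, which a quick numerical scan confirms (a sketch in Planck units, with ħ = Lp = 1 as an illustrative normalization):

```python
hbar, Lp = 1.0, 1.0   # Planck units, an illustrative normalization

def delta_x(delta_p):
    """Gravitationally corrected uncertainty: dx ~ hbar/dp + Lp^2*dp/hbar.
    The first term is Heisenberg's; the second is the black-hole correction."""
    return hbar / delta_p + Lp**2 * delta_p / hbar

# scan dp: dx is minimized near dp = hbar/Lp, where dx_min = 2*Lp --
# no measurement can resolve distances below the Planck scale
dx_min, dp_star = min((delta_x(0.01 * k), 0.01 * k) for k in range(1, 1000))
```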

"In statistical physics, this scale, although introduced for formal reasons, has a natural interpretation because the theories are always effective theories built at a given microscopic scale. It corresponds in general to the range of interaction of the constituents of the model, for example, a lattice spacing for spins, the average intermolecular distance for fluids. In particle physics, things are less simple. At least psychologically. It was indeed natural in the early days of quantum electrodynamics to think that this theory was fundamental, that is, not derived from a more fundamental theory. More precisely, it was believed that QED had to be mathematically internally consistent, even if in the real world new physics had to occur at higher energies. Thus, the regulator scale was introduced only as a trick to perform intermediate calculations. The limit /\ → ∞ was supposed to be the right way to eliminate this unwanted scale, which anyway seemed to have no interpretation. We shall see in the following that the community now interprets the renormalization process differently (4) Once the theory is regularized, F can be a nontrivial function of x. The price is that different values of x now correspond to different values of the coupling constant (defined as the values of F for these x). Actually, it does no longer make sense to speak of a coupling constant in itself. The only meaningful concept is the pair (μ, gR(μ)) of coupling constants at a given scale. The relevant question now is, “What are the physical reasons in particle/statistical physics that make the coupling constants depend on the scale while they are constants in the classical/mean field approximation?” As mentioned, for particle physics, the answer is the existence of new quantum fluctuations corresponding to the possibility of creating (and annihilating) particles at energies higher than mc^2. 
What was scale independent in the classical theory becomes scale dependent in the quantum theory because, as the available energy increases, more and more particles can be created. The pairs of (virtual) particles surrounding an electron are polarized by its presence and thus screen its charge. As a consequence, the charge of an electron depends on the distance (or equivalently the energy) at which it is probed, at least for distances smaller than the Compton wavelength. Note that the energy scale mc^2 should not be confused with the cut-off scale . mc^2 is the energy scale above which quantum fluctuations start to play a significant role while /\ is the scale where they are cut-off. Thus, although the Compton wave length is a short distance scale for the classical theory, it is a long distance scale for QFT, the short one being /\^−1.

There are thus three domains of length scales in QFT: above the Compton wavelength, where the theory behaves classically (up to small quantum corrections coming from high energy virtual processes); between the Compton wavelength and the cut-off scale Λ^−1, where the relativistic and quantum fluctuations play a great role; and below Λ^−1, where a new, more fundamental theory has to be invoked.10

In statistical physics, the analogue of the Compton wavelength is the correlation length, which is a measure of the distance at which two microscopic constituents of the system are able to influence each other through thermal fluctuations.38 For the Ising model, for instance, the correlation length away from the critical point is of the order of the lattice spacing, and the corrections to the mean-field approximation due to fluctuations are small. Unlike particle physics, where the masses and therefore the Compton wavelengths are fixed, the correlation lengths in statistical mechanics can be tuned by varying the temperature. Near the critical temperature where the phase transition takes place, the correlation length becomes extremely large, and fluctuations on all length scales between the microscopic scale of order Λ^−1, a lattice spacing, and the correlation length add up to modify the mean-field behavior (see Refs. 21, 22 and also Ref. 23 for a bibliography on this subject). We see here a key to the relevance of renormalization: two very different scales must exist between which a non-trivial dynamics (quantum or statistical in our examples) can develop. This situation is a priori rather unnatural, as can be seen for phase transitions, where a fine tuning of temperature must be implemented to obtain correlation lengths much larger than the microscopic scale. Most of the time, physical systems have an intrinsic scale (of time, energy, length, etc.) and all the other relevant scales of the problem are of the same order. All phenomena occurring at very different scales are thus almost completely suppressed. The existence of a unique relevant scale is one of the reasons why renormalization is not necessary in most physical theories. In QFT it is mandatory because the masses of the known particles are much smaller than a hypothetical cut-off scale Λ, still to be discovered, where new physics should take place.
This is a rather unnatural situation, because, contrary to phase transitions, there is no analogue of a temperature that could be fine-tuned to create a large splitting of energy, that is, mass, scales. The question of naturalness of the models we have at present in particle physics is still largely open, although there has been much effort in this direction using supersymmetry.
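The fine tuning required can be made quantitative with the toy scaling law ξ/a ~ |t|^−ν, where t = (T − Tc)/Tc; the mean-field exponent ν = 1/2 is used here purely for illustration:

```python
def xi_over_a(t, nu=0.5):
    """Toy critical scaling of the correlation length in lattice units:
    xi/a ~ |t|^(-nu), with t = (T - Tc)/Tc and nu = 1/2 (mean field)."""
    return abs(t) ** (-nu)

# a correlation length of ~1000 lattice spacings already requires tuning
# the temperature to the critical point at the level of one part in a million
tuning = 1e-6
print(xi_over_a(tuning))   # ~ 1000 lattice spacings
```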

(5) The classical theory is valid down to the Compton/correlation length, but cannot be continued naively beyond this scale; otherwise, when mixed with the quantum formalism, it produces divergences. Actually, it is known in QFT that the fields should be considered as distributions and not as ordinary functions. The need for considering distributions comes from the non-trivial structure of the theory at very short length scales, where fluctuations are very important. At short distances, functions are not sufficient to describe the field state, which is not smooth but rough, and distributions are necessary. Renormalizing the theory actually consists in building, order by order, the correct “distributional continuation” of the classical theory. The fluctuations are then correctly taken into account and depend on the scale at which the theory is probed: this non-trivial scale dependence can only be taken into account theoretically through the dependence of the (analogue of the) function F on x, and thus of the coupling on the scale μ.

(6) If the theory is perturbatively renormalizable, the pairs (μ, g(μ)) form an equivalence class of parametrizations of the theory. The change of parametrization from (μ, g(μ)) to (μ′, g(μ′)), called a renormalization group transformation, is then performed by a law which is self-similar, that is, such that it can be iterated several times while being form-invariant.19,20 This law is obtained by the integration of

... In particle physics, the β-function gives the evolution of the strength of the interaction as the energy at which it is probed varies, and the integration of the β-function partially resums the perturbation expansion. First, as the energy increases, the coupling constant can decrease and eventually vanish. This is what happens when α > 0 in Eqs. (65) and (66). In this case, the particles almost cease to interact at very high energies or, equivalently, when they are very close to each other. The theory is then said to be asymptotically free in the ultraviolet domain.3,5 Conversely, at low energies the coupling increases and perturbation theory can no longer be trusted. A possible scenario is that bound states are created at a sufficiently low energy scale, so that the perturbation approach has to be reconsidered in this domain to take into account these new elementary excitations. Non-abelian gauge theories are the only known theories in four-dimensional spacetime that are ultraviolet free, and it is widely believed that quantum chromodynamics — which is such a theory — explains quark confinement. The other important behavior of the scale dependence of the coupling constant is obtained for α < 0, in which case it increases at high energies. This corresponds for instance to quantum electrodynamics. For this kind of theory, the dramatic increase of the coupling at high energies is supposed to be a signal that the theory ceases to be valid beyond a certain energy range and that new physics, governed by an asymptotically free theory (like the standard model of electro-weak interactions), has to take place at short distances.
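Both behaviors follow from integrating a one-loop flow dg/dln μ = −αg², whose closed-form solution is standard; the sign convention is chosen so that α > 0 is the asymptotically free case, as in the text, and the numerical values are illustrative only:

```python
import math

def run(g, mu_from, mu_to, alpha):
    """One-loop running coupling: closed-form integral of dg/dln(mu) = -alpha*g^2."""
    return g / (1.0 + alpha * g * math.log(mu_to / mu_from))

# alpha > 0: asymptotic freedom -- the coupling shrinks as the energy grows
free = [run(0.1, 1.0, mu, alpha=1.0) for mu in (1.0, 1e2, 1e4, 1e8)]
# alpha < 0: QED-like -- the coupling grows with energy, eventually
# leaving the domain where perturbation theory can be trusted
qed_like = [run(0.1, 1.0, mu, alpha=-0.05) for mu in (1.0, 1e2, 1e4, 1e8)]

# the law is self-similar: running 1 -> 100 in one step equals running
# 1 -> 10 and then 10 -> 100 with the same form-invariant law
direct = run(0.1, 1.0, 100.0, alpha=1.0)
composed = run(run(0.1, 1.0, 10.0, alpha=1.0), 10.0, 100.0, alpha=1.0)
```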

(7) Renormalizability, or its non-perturbative equivalent, self-similarity, ensures that although the theory is initially formulated at the scale μ, this scale together with g0 can be entirely eliminated in favor of another scale better adapted to the physics we study. If the theory were solved exactly, it would make no difference which parametrization we used. However, in perturbation theory, this renormalization lets us avoid calculating small numbers as differences of very large ones. It would indeed be very unpleasant, and actually meaningless, to calculate energies of order 100 GeV, for instance — the scale μ of our analysis — in terms of energies of order of the Planck scale ~ 10^19 GeV, the analogue of the scale Λ. In a renormalizable theory, the possibility to perturbatively eliminate the large scale has a very deep meaning: it is the signature that the physics is short distance insensitive, or equivalently that there is a decoupling of the physics at different scales. The only memory of the short distance scale lies in the initial conditions of the renormalization group flow, not in the flow itself: the β-function does not depend on Λ. We again emphasize that, usually, the decoupling of the physics at very different scales is trivially related to the existence of a typical scale such that the influence of all phenomena occurring at different scales is almost completely suppressed. Here, the decoupling is much more subtle because there is no typical length in the whole domain of length scales that are very small compared with the Compton wavelength and very large compared with Λ^−1. Because interactions among particles correspond to non-linearities in the theories, we could naively believe that all scales interact with each other — which is true — so that calculating, for instance, the low energy behavior of the theory would require the detailed calculation of all interactions occurring at higher energies. Needless to say, in a field theory involving infinitely many degrees of freedom — the value of the field at each point — such a calculation would be hopeless, apart from exactly solvable models.
Fortunately, such a calculation is not necessary for physical quantities that can be calculated from renormalizable couplings only. Starting at very high energies, typically Λ, where all coupling constants are naturally of order 1, the renormalization group flow drives almost all of them to zero, leaving only, at low energies, the renormalizable couplings. This is the interpretation of non-renormalizable couplings. They are not terrible monsters that should be forgotten, as was believed in the early days of QFT. They are simply couplings that the RG flow eliminates at low energies. If we are lucky, the renormalizable couplings become rather small after their RG evolution between Λ and the scale μ at which we work, and perturbation theory is valid at this scale. We see here the phenomenon of universality: among the infinitely many coupling constants that are a priori necessary to encode the dynamics of the infinitely many degrees of freedom of the theory, only a few are ultimately relevant.25 All the others are washed out at large distances. This is the reason why, perturbatively, it is not possible to keep these couplings finite at large distance, and it is necessary to set them to zero.39 The simplest non-trivial example of universality is given by the law of large numbers (the central limit theorem), which is crucial in statistical mechanics.21 In systems where it can be applied, all the details of the underlying probability distribution of the constituents of the system are irrelevant for the cooperative phenomena, which are governed by a gaussian probability distribution.24 This drastic reduction of complexity is precisely what is necessary for physics, because it lets us build effective theories in which only a few couplings are kept.10 Renormalizability in statistical field theory is one of the non-trivial generalizations of the central limit theorem.
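The central limit theorem statement can be checked directly: sums of many microscopic variables look Gaussian regardless of the microscopic law. A quick Monte Carlo sketch (sample sizes and seed are arbitrary choices):

```python
import random
random.seed(1)

def rescaled_sums(draw, n=500, trials=2000):
    """Draw `trials` sums of n iid variables, then center and normalize them."""
    sums = [sum(draw() for _ in range(n)) for _ in range(trials)]
    mean = sum(sums) / trials
    sd = (sum((s - mean) ** 2 for s in sums) / trials) ** 0.5
    return [(s - mean) / sd for s in sums]

def kurtosis(z):
    """Fourth moment of a normalized sample; equals 3 for a Gaussian."""
    return sum(v ** 4 for v in z) / len(z)

# two very different microscopic distributions...
coin = rescaled_sums(lambda: random.choice([-1.0, 1.0]))
unif = rescaled_sums(lambda: random.uniform(-1.0, 1.0))
# ...give the same Gaussian cooperative behavior: kurtosis close to 3 for both,
# so the microscopic details are washed out, as universality asserts
```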

(8) The cut-off Λ, first introduced as a mathematical trick to regularize integrals, has actually a deep physical meaning: it is the scale beyond which new physics occurs and below which the model we study is a good effective description of the physics. In general, it involves only the renormalizable couplings and thus cannot pretend to be an exact description of the physics at all scales. However, if Λ is very large compared with the energy scale in which we are interested, all non-renormalizable couplings are highly suppressed and the effective model, retaining only renormalizable couplings, is valid and accurate (the Wilson RG formalism is well suited to this study, see Refs. 25 and 26). In some models — the asymptotically free ones — it is possible to formally take the limit Λ → ∞ both perturbatively and non-perturbatively, and there is therefore no reason to invoke a more fundamental theory taking over at a finite (but large) Λ. Let us emphasize here several interesting points.

(i) For a theory corresponding to the pair (μ, gR(μ)), the limit Λ → ∞ must be taken within the equivalence class of parametrizations to which (μ, gR(μ)) belongs.40 A divergent non-regularized perturbation expansion consists in taking Λ = ∞ while keeping g0 finite. From this viewpoint, the origin of the divergences is that the pair (Λ = ∞, g0) does not belong to any equivalence class of a sensible theory. Perturbative renormalization consists in computing g0 as a formal power series in gR (at finite Λ), so that (μ0, g0) corresponds to a mathematically consistent theory; we then take the limit Λ → ∞.

(ii) Because of universality, it is physically impossible to know from low energy data whether Λ is very large or truly infinite.

(iii) Although mathematically consistent, it seems unnatural to reverse the RG process while keeping only the renormalizable couplings, and thus to imagine that even at asymptotically high energies, Nature has used only the couplings that we are able to detect at low energies. It seems more natural that a fundamental theory does not suffer from renormalization problems. String theory is a possible candidate.27

To conclude, we see that although the renormalization procedure has not evolved much over the last thirty years, our interpretation of renormalization has drastically changed10: the renormalized theory was assumed to be fundamental, while it is now believed to be only an effective one; Λ was interpreted as an artificial parameter that was only useful in intermediate calculations, while we now believe that it corresponds to a fundamental scale where new physics occurs; non-renormalizable couplings were thought to be forbidden, while they are now interpreted as the remnants of interaction terms in a more fundamental theory. The renormalization group is now seen as an efficient tool to build effective low energy theories when large fluctuations occur between two very different scales that change qualitatively and quantitatively the physics."

Complete paper: http://arxiv.org/pdf/hep-th/0212049v3

For example, the spin-1 tetrad vector e^I theory of gravity is to Einstein's spin-2 metric tensor g_uv theory of gravity as the SU(2) gauge theory of the weak force is to Fermi's four-fermion contact model, which was not renormalizable.


Written by Jack Sarfatti

Published on Tuesday, 16 November 2010 09:50

