Local Interstellar Cloud and Galactic Superwave effects on the Earth

Illustration courtesy of Linda Huff  (American Scientist), Priscilla Frisch (U. Chicago)

The Local Interstellar Cloud.


Though we may have already been inside what is known as the Local Interstellar Cloud for tens or hundreds of thousands of years, scientists have been discussing regional areas, aka “cloudlets”, of variable density that we may have entered as recently as the 1990s. For example, see this NASA story from Feb. 2002 or this NASA story from Jan. 2003:

“Some of those cloudlets might be hundreds of times denser than the local fluff,” says Priscilla Frisch, an astrophysicist at the University of Chicago who studies the local interstellar medium. “If we ran into one, it would compress the Sun’s magnetic field and allow more cosmic rays to penetrate the inner solar system, with unknown effects on climate and life.”

A collection of articles with brief summaries about this phenomenon may be found here: http://www.susanrennison.com/Joyfire_Interstellar_Cloud_Index.php

There seems to be a large overlap here with Dr. LaViolette’s theories about the galactic superwave and the chain-reaction effect it would have on the solar system, Sun, and Earth, with past events being recorded in the Earth’s polar ice core record.  A few questions come to mind:

  1. How likely is it that the solar system’s movement through these variable density clouds will affect the Sun and Earth in a way similar to how a superwave has done in the past? Do you have any general thoughts on the significance of the Local Interstellar Cloud and its cloudlets with respect to its effects on our solar system/Sun/Earth/human bodies/minds?  Is this a real danger to be concerned with?
  2. Would such an event inject extra-terrestrial dust sufficient to produce increased concentrations of cosmic dust indicators similar to those you found in ice age polar ice core samples?
  3. Is it fair to say that possibly some of the evidence for elevated cosmic ray activity found in ice age ice core samples could be evidence for this kind of “compression” of the Sun’s heliosphere/magnetic field/etc. by these cloudlets?


I will try here to answer Matt’s questions.

a) Regarding the first question about the incursion of this approaching interstellar cloudlet.  First we must ask how close it is and when it will actually come into our solar system.  In this regard, if you check carefully the news announcements made by astronomer Priscilla Frisch, she does not say that such a cloudlet has actually been detected, only that there is a high likelihood that cloudlets may be embedded in the Local Interstellar Cloud (i.e., within the Local Fluff) which have gas densities hundreds of times higher than the Local Interstellar Cloud average.  This Local Fluff is said to be 30 light years wide and travelling past us at 28 km per second.  So at that rate we will be passing through it for roughly the next 300,000 years.  If such a cloudlet were as close as 1 to 2 light years away from us, at this rate it would take 10,000 to 20,000 years to reach us.  I would say that such an arrival date is a bit down the road and that there are more serious things to be concerned with before that time, such as the impending arrival of a galactic superwave, which I expect is very likely to occur in the next few centuries.  Unfortunately, it is not possible to predict a superwave’s time of arrival through satellite observation since a superwave travels towards us at the speed of light.  Hence when it has arrived, that is when we will see it.
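As a rough check on the travel times quoted above, the arithmetic can be sketched in a few lines (the light-year and year conversion constants are standard values, not taken from the text):

```python
# Back-of-envelope crossing times for the Local Fluff passage, using
# the figures quoted above: a 30 light-year cloud width and a 28 km/s
# Sun-cloud relative speed.
LY_KM = 9.461e12           # kilometres in one light year
SPEED_KM_S = 28.0          # quoted relative velocity
SECONDS_PER_YEAR = 3.156e7

def crossing_time_years(distance_ly: float) -> float:
    """Years needed to traverse `distance_ly` at the quoted 28 km/s."""
    return distance_ly * LY_KM / SPEED_KM_S / SECONDS_PER_YEAR

print(f"30 ly cloud:   {crossing_time_years(30):,.0f} yr")  # ~320,000 yr
print(f"1 ly standoff: {crossing_time_years(1):,.0f} yr")   # ~10,700 yr
```

The 30 light-year figure gives roughly the "next 300,000 years" quoted, and a 1 to 2 light-year standoff gives the 10,000 to 20,000 year lead time.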

In regard to getting a fix on any such cloudlet, as I understand it, our current satellite and spacecraft observations are not refined enough to detect anything of this sort with any certainty.  The energized plasma ribbon discovered by IBEX, which is positioned at the outer boundary of the heliopause, is an entirely different phenomenon.  In my opinion it bears no relation to any so-called impacting cloudlet.  I believe the ribbon to be a stationary phenomenon associated with the heliopause shock region.  The reason why it is so energetic is that our solar system was impacted by an intense volley of cosmic rays as recently as 11,000 to 16,000 years ago, with a very minor event possibly having impacted around 5,300 years ago, just prior to the emergence of Egyptian civilization.

Others do consider the possibility that there may be a connection between the ribbon and energization by the impacting interstellar wind.  Dr. Frisch believes that this high energy band could be the first sign of any change brought about by an interstellar cloud entering the heliosphere.  She says that the energetic neutral atoms in the IBEX Ribbon derive their energy from energetic ions in the solar wind and outermost regions of the heliosphere, and adjacent interstellar space.  But we have no direct measurements of energetic ions beyond the heliopause.  So all this is open to question.

However, suppose we assume for the moment that there is an impending threat from such a cloudlet incursion.  Would the solar and climatic effects be like those of a superwave?  Well, we can do some calculations to find out.  The Local Fluff (LIC) has a density of ~0.1 hydrogen atoms/cm³.  Above it was suggested that an approaching cloudlet inclusion could have a density hundreds of times greater than that of the Local Interstellar Cloud, hence a density of, say, around 20 to 50 hydrogen atoms per cubic centimeter.  In a recent personal communication with me, Dr. Frisch related that the gas density in a very tiny dust cloud could even reach as high as 1000 atoms/cc.  If we take this extreme example, we calculate a cloud mass density of around 1.5 × 10⁻²¹ grams/cm³.  An interstellar cloud incursion of this sort, I believe, would have a significant climatic effect and a significant solar effect.  But the most dangerous phase would likely last for several years, rather than for centuries or millennia as is often the case for the effects from a superwave.
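The conversion from the quoted number densities to a mass density is a one-line calculation; the hydrogen atom mass used below is the standard value (not from the text), and the small difference from the ~1.5 × 10⁻²¹ figure above depends on the mean atomic mass one assumes:

```python
# Converting the quoted gas number densities (atoms per cubic cm)
# to mass densities, using the hydrogen atom mass.
M_H_G = 1.67e-24  # grams per hydrogen atom

def mass_density(n_atoms_per_cc: float) -> float:
    """Mass density in g/cm^3 for a pure-hydrogen gas."""
    return n_atoms_per_cc * M_H_G

print(mass_density(0.1))    # Local Fluff average: ~1.7e-25 g/cm^3
print(mass_density(1000))   # extreme cloudlet:    ~1.7e-21 g/cm^3
```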

I discussed a similar interstellar cloud incursion scenario in my 1983 PhD dissertation, which is available in updated form on the Galactic Superwave CD at etheric.com.  Pages 94–96 of this dissertation mention the 1950 paper by Fred Hoyle and Raymond Lyttleton, which examined this scenario of climate effects resulting from the incursion of an interstellar cloud having a density of 10⁻²¹ g/cm³ advancing toward the solar system at 1 km/s.  They had proposed that energy released from the infall of this dust into the Sun would aggravate the Sun and increase its luminosity by up to 10%, mostly in the ultraviolet.

The dense cloud that Frisch talks about as having an outside possibility of encountering us would have a gas density similar to the cloud that Hoyle and Lyttleton were considering, but would be travelling almost 30 times faster.  So there would be a far smaller chance that any of it would be swallowed by the Sun and cause a luminosity increase of the sort they consider.  Dr. Frisch related to me that astronomers today are no longer considering the type of gas cloud encounter effect that Hoyle and Lyttleton discussed 60 years ago; instead they are modeling the trajectory of today’s interstellar dust grains as they pass through the heliosphere.  They find that the smallest of the grains don’t make it in at all, because the Lorentz force excludes these high charge-to-mass grains.  The largest grains are ‘gravitationally focused’ downwind of the Sun because of gravity and the relative Sun-cloud motion.  There aren’t very many of these large grains because their number falls off with size as a power law.

So we might expect a similar situation for the passage of the dense interstellar cloud we were considering above.  Because of its high velocity relative to the solar system (28 km/s), most of this gas would be gravitationally focused downwind of the Sun and hence would not be accreted by the Sun.  Hence any luminosity change would be quite minimal.  Hoyle’s cloud was moving very slowly (1 km/s), and because of this the Sun would have accreted a very large quantity of its material.  In the case of this much faster cloud passage, let us suppose that the Sun accreted only 5% as much gas, causing a solar luminosity increase of just 0.5%.  By comparison, solar luminosity normally varies by ±0.1% over the sunspot cycle.  So this proposed local interstellar cloud incursion would increase solar luminosity around fivefold over the amount that normally occurs during the course of a typical solar cycle.

But the fractional increase in UV would be much greater.  We know that currently the solar UV output is maximum at solar max due to increased solar flare activity at the solar cycle peak, with the UV variation amounting to about 10% – 20% of the total irradiance variation; hence this UV change amounts to about a 0.01% change in solar luminosity.  Consequently, a 0.5% increase in UV of the sort expected from this hypothetical cloud encounter would be some 50-fold greater than the UV increase that occurs over the course of a solar cycle!  This begins to approach the UV excesses seen in a T Tauri flare star and could pose a serious hazard.
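The chain of percentages in the last two paragraphs can be reproduced directly; the 5% accretion fraction and the ~10% UV share are the assumptions stated in the text, not measured values:

```python
# Chaining the assumptions above: Hoyle-Lyttleton's up-to-10% boost,
# a 5% accretion fraction for the fast cloud, and solar-cycle baselines.
hoyle_boost = 0.10          # slow-cloud luminosity increase (upper limit)
accretion_fraction = 0.05   # assumed for the 28 km/s passage
cloud_boost = hoyle_boost * accretion_fraction  # 0.005 -> 0.5% of luminosity

cycle_swing = 0.001                   # +/-0.1% total-irradiance variation
print(cloud_boost / cycle_swing)      # ~5x the normal cycle swing

uv_cycle_change = cycle_swing * 0.10  # UV is ~10-20% of the cycle swing
print(cloud_boost / uv_cycle_change)  # ~50x, if the boost is mostly UV
```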

The luminosity increase from this cloud encounter would be much smaller in magnitude than the climatic impact I had considered in my dissertation for a superwave dust incursion event.  On page 96 of my dissertation I propose that the estimated cosmic dust influx that occurred during past ice age superwave encounters could have increased solar luminosity by 0.5% due to cosmic dust accretion by the Sun.  This is in the range of the luminosity increase we estimated above for the approaching interstellar cloudlet.  However, in a superwave event I was noting that there would be a tenfold greater effect on the Earth’s radiation budget due to what I called the interplanetary hothouse effect (light scattered from cosmic dust blown into the solar system by the superwave).  I had estimated a 5% increase in radiation to the Earth just from this effect.  Also I had indicated that there would have been a significant warming effect due to the reddening of the Sun’s spectrum caused by a dust cocoon that would have formed around the Sun, and also a cooling effect due to an increase in stratospheric dust concentration.

These cosmic dust effects, however, would be negligibly small in the case of an interstellar cloud incursion.  According to Dr. Frisch, about 1% of the mass of the cloud would be in the form of cosmic dust.  So in the case of the extremely dense cloud we discuss above, we are talking about a cosmic dust concentration of around 10⁻²³ grams/cm³ invading the solar system.  This would cause about a 5% increase of the present interplanetary dust concentration, which is rather insignificant.
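The dust figure follows directly from the 1%-by-mass assumption applied to the extreme cloudlet density estimated earlier:

```python
# Dust mass density implied by the 1%-dust assumption for the extreme
# cloudlet considered above.
gas_density = 1.5e-21   # g/cm^3, the extreme-cloudlet figure from the text
dust_fraction = 0.01    # ~1% of cloud mass in dust (per Frisch)
dust_density = gas_density * dust_fraction
print(dust_density)     # ~1.5e-23 g/cm^3, i.e. of order 1e-23
```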

So how much of an effect would a 0.5% increase in solar luminosity have on climate?  Scientists have searched for a solar cycle climatic effect due to the ±0.1% variation in solar luminosity over the 11-year solar cycle.  Generally they find there to be no impact on global climate.  However, a recent study coming out of Imperial College London and Oxford has found that winter weather in Europe is affected locally, with winters being warmer at the time of a sunspot cycle peak (and solar cycle luminosity peak).  So far we haven’t seen this to be the case with the current solar cycle, since the European winter has been particularly cold this year, but we will see what happens next year.  It is difficult to extrapolate to the case of this interstellar cloudlet, but a luminosity increase five times that seen at a solar cycle peak should definitely make winters in Europe far warmer than we can remember.  Maybe good from the standpoint of saving on heating bills.  But I would expect there would also be some global effect with a luminosity increase this large.  It would likely worsen the past global warming trend and also reverse the current climatic cooling trend that some associate with the recent general reduction in the Sun’s flaring activity.  This could accelerate polar melting, with its associated sea level rise, and could cause increased drought in the lower latitudes (e.g., Africa, southwestern U.S., etc.).

However, it is likely that this solar luminosity increase would not last for many years.  Solar flare activity is tied to matter infall onto the Sun.  So we would expect solar flare activity to increase dramatically, and it might occur continuously, even during solar cycle minimum.  We could likely expect a repeat of the 1859 Carrington Event which, if it occurred today, could wipe out all satellite communication, bring down the electrical power grid on a global scale, and damage electrical appliances, plunging society back to the horse and buggy days.  The U.S. National Research Council report warning of such a scenario is discussed here.

Also, last year I published a paper demonstrating that the mass extinction of megafauna at the end of the ice age was likely due to extinction level solar proton events bombarding the Earth.  This is discussed in the following press release.  I don’t believe that the Sun would reach the level of activity that it had at the end of the ice age, which is evident from NASA studies of lunar rocks.  The reason is that the superwave incursion proposed to have been occurring at that time would likely have surrounded the Sun with a dust shroud that would have reflected light back onto the Sun and greatly contributed to aggravating the Sun’s level of flaring activity.  No such dense dust shroud would be present during the proposed cloudlet incursion.  But I wouldn’t entirely rule out the possibility that the increase in solar activity associated with the proposed cloudlet incursion might produce a super solar flare of a size capable of producing a solar proton event of so large a magnitude.

If solar flare activity were to substantially increase, the increased solar cosmic ray bombardment would also cause increased destruction of the ozone layer.  The polar ozone holes would likely expand to lower latitudes.  A reduction in ozone protection coupled with a 50-fold increase in solar UV output would be disastrous.  People would have to put on sun block any time they went out and would have to carry an umbrella to shield themselves from the Sun.  Even if humans took precautions when they ventured out into the Sun, would animals also take precautions and come out only at night?  What about livestock?  A large increase in the UV level could have a substantial impact on the food supply.  There would be some negative effect on plant life, but it would not be nearly as significant a hazard as it would be for animal life.

The only good thing about the elevation of solar activity is that this would increase the force of the solar wind and expand the heliopause outward, thereby helping to force this cloudlet away so that it travels around the heliopause rather than through it.  So the effects of the cloudlet incursion would likely diminish after a few years as the Sun’s activity picked up.  Thereafter, some lower long-term equilibrium level would likely be reached between the Sun’s level of flare activity and the rate of cloudlet gas influx.  Currently, due to the lower than normal solar activity, the heliopause sheath is pushed in closer to the Sun.  So the solar system is currently more vulnerable to a cloudlet incursion.

b) Regarding your second question, to compare this prospective increase in interstellar dust influx with the increases that occurred during the ice age (estimated from my analysis of Greenland polar ice), we would first have to know the cosmic dust concentration in this cloudlet.  Above we estimated that this cloud would have a cosmic dust density of around 10⁻²³ grams/cm³, which is 5% of the current interplanetary dust density.  So incursion of the cloudlet dust would not come anywhere close to the scenarios I describe for a superwave arrival, which would create dust concentrations over 1,000 times greater than what would be supplied by this cloudlet.

c) Regarding your third question, whether the evidence for elevated cosmic ray intensities recorded in the ice age portion of the polar ice record could have been due to past cloudlets compressing the heliopause inward: I don’t think that these could be attributed to cloudlet encounters.  I still think that galactic superwaves are the best explanation for these recurring beryllium-10 peaks found in the ice record.  It is a point of debate whether a compressed heliopause sheath is due to the impact of a cloudlet or simply to a reduction of outward solar wind pressure.  I think that it is mostly dependent on the latter.  Keep in mind that the heliopause is always impacted by the interstellar wind, whether a cloudlet is present or not, and its position on the upwind side is largely determined by a balance between the inward interstellar wind pressure and the outward solar wind pressure.  The presence of an interstellar cloudlet could increase the inward pressure, but the other side of the equation is the level of solar activity.

I understand that some astronomers are presently alarmed to find that the outer boundary of the heliopause sheath is as close as 1000 AU, with the inner boundary at ~70 AU.  The heliopause sheath would be far more compressed during a superwave arrival.  As I pointed out in my dissertation, during a superwave event similar to those that appear to have occurred during the last ice age, the inner boundary of the heliopause sheath could have become so greatly compressed that its upwind side would have been positioned between the orbits of Mars and Jupiter, hence around 3 AU.  This would have allowed easier entry of vaporized cosmic dust.

d) Just to add a few more things in regard to the first question.  A superwave event would pose a far greater and far more prolonged climatic hazard to the Earth and humanity than this cloudlet encounter would.  The presence of this interstellar cloudlet could only worsen the effects that a superwave would have on our solar system, since it would provide a greater supply of gas that could be blown into the solar system by the superwave.  Since the superwave would compress the heliopause to a far greater extent than would otherwise occur, this material would enter far more easily than it would in the absence of a superwave.

Some speculate whether military forces around the world have been planning for such an event, and that this is why they have been building vast underground facilities and fallout shelters.  A few places that come to mind are the facilities beneath the Denver airport, and many others are rumored to have been outfitted in the U.S.  I was personally told about an abandoned gold mine which has been outfitted with living space far below ground level, with space being leased out to people with money.  The Norwegian government has been building a vast network of underground shelter facilities, or arks, as have many other governments.  Project Camelot has an interesting page on this.  Some industrialists may also be in the know.  Richard Branson has built 90% of his Virgin Galactic New Mexico Spaceport underground.  See this AP news article and this LA Times article.  Many in the area had wondered why so much of his construction was being built underground, considering that land in the area is comparably inexpensive.  Could the CEO have been tipped off about the possible occurrence of a future catastrophic event?

So the question that arises is whether all of this may have been inspired from fears about this incoming interstellar cloudlet and kept secret so as not to cause financial panic.  By as early as 1963, the U.S. military reportedly had deployed satellites around all the inner planets and a few of the more distant outer planets at a time when NASA had only just announced sending a spacecraft to Venus; see Secrets of Antigravity Propulsion, page 396.  The military has always been several steps ahead of NASA in solar system surveillance.  Have they known about this cloudlet coming and have they for a long time been making preparations?  Could they have access to information that is presently unavailable to the astronomical community?  Or could it be they are preparing for a superwave arrival rather than an interstellar cloudlet arrival?  This is left to speculation.

Dr. Frisch has told me that our knowledge of the local interstellar environment is continuously improving.  She says that we are getting more and better data on the Local Fluff, including high spectral-resolution Hubble Space Telescope data and measurements of the interstellar magnetic field in the Local Fluff.  She feels that if there is a tiny dense cloud within 30 light years, we might be able to figure out a way to identify it in the next several years.  Of course, even if such a cloud were at our doorstep, say 140 astronomical units (AU) away, we would have plenty of time before it arrived.  At the current 28 km/s velocity such a cloud would move about 5.7 AU per year.  So we would have about 25 years before it reached Earth’s orbit at 1 AU.
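The unit conversion here is straightforward to check (the AU and year constants are standard values; the result comes out slightly above the ~5.7 AU/yr quoted, the difference depending on the exact velocity assumed):

```python
# Converting the quoted 28 km/s to AU per year, and the resulting
# warning time from a hypothetical 140 AU standoff distance.
AU_KM = 1.496e8            # kilometres in one astronomical unit
SECONDS_PER_YEAR = 3.156e7

speed_au_per_yr = 28.0 * SECONDS_PER_YEAR / AU_KM
print(speed_au_per_yr)        # ~5.9 AU/yr (close to the ~5.7 quoted)
print(139 / speed_au_per_yr)  # ~24 yr to close from 140 AU to 1 AU
```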

Finally, some who are outside of the astronomical community, and presumably are not themselves scientists, believe that the Local Interstellar Cloud may be hiding a planet X or brown dwarf that is approaching the solar system.  I find this totally implausible.  For one thing, the amount of dust between us and the far end of this local fluff is so insignificant that it would not obscure such bodies.  Furthermore, if such a body were present it would have to be detected with an infrared telescope, since the intrinsic temperature of a brown dwarf or planet drifting through interstellar space that far from our Sun would be no more than 120° above absolute zero (i.e., about minus 150° C).  Such objects can only be detected at infrared wavelengths in the range of 2 to 50 microns, and such wavelengths are not affected by dust; they pass through essentially unattenuated.  In addition, the amount of obscuration is very low even at visible wavelengths.  If we were to suppose that this presumed planet were 1 light year away, the dust column density obscuring it would be only 10⁻⁸ g/cm², which is 1000 times less than the amount of dust between us and the Galactic center.

Paul LaViolette March 2, 2012

Changes in the angular size of the Crab pulsar

Related to the earlier posting about the gamma ray flare observed in the Crab remnant, it is worth mentioning the findings of astronomer A. G. F. Brown published in 1976: http://adsabs.harvard.edu/full/1976MNRAS.176P..53B

Brown made interplanetary scintillation observations of the Crab pulsar at a radio frequency of 81.5 MHz and found that over a four-year period between 1971 and 1975 the angular diameter of the Crab pulsar radio source increased over threefold, from 0.2±0.1″ of arc to 0.7±0.1″ of arc.  The angular diameter of a pulsar’s radio image is determined by the amount of scattering its radio signal experiences as it encounters electrons in the interstellar medium.  A larger radio image diameter implies greater scattering, which in turn implies a greater interstellar electron concentration.  Brown ruled out changes in the solar plasma as being responsible for the change.  He also found it unlikely that it was caused by changes in scattering in the immediate vicinity of the pulsar.

I would suggest that these changes in interstellar medium scattering are produced by the superwave that is now passing through the Crab nebula’s vicinity and which can change the electron density encountered along the line of sight to the Crab pulsar.

Explaining the ring-like waves of X-ray emission around the Crab pulsar

X-ray map of the inner portion of the Crab Nebula.

Credit: NASA/CXC/MSFC/M.Weisskopf et al & A.Hobart


In a recent comment, gmagee inquired about the rings of X-ray emission that are seen to be expanding away from the Crab pulsar and whether this activity might be more likely interpreted as being intrinsic to the pulsar wind rather than to an impacting galactic cosmic ray volley.  This ring motion was reported in the news today, one story appearing in PhysOrg (http://www.physorg.com/news/2011-05-crab-nebula-action-case-dog.html).
In answer to this question, I would respond, no.  The expanding ring of emission is most likely produced by the superwave, not by the Crab pulsar.  Much of the misconception in interpreting this phenomenon stems from the all too common belief that the Crab pulsar lies near the geometrical center of the Crab Nebula.  This misconception is perpetuated not only in technical papers but in media news reports such as the above cited report.  To the contrary, as I had proposed in chapter 5 of my 1983 Ph.D. dissertation (see in particular pp. 179–180 of the dissertation update), a careful analysis of the kinematics of the Crab pulsar and of the high velocity filaments traveling outward from the explosion center shows that the Crab pulsar is most likely situated at the forefront of the Nebula (4–5 light years from the center) and is traveling almost directly towards us at ~1500 km/s (a 2° angle deviation from our line of sight).  Only when viewed in projection from our vantage point does it “appear” to lie at the geometrical center of the Nebula.  I would rather not go into the details of this explanation here since it is rather extensive, but refer readers to my dissertation.  The peripheral nebular placement of the Crab pulsar is also dealt with, to a much lesser extent, on page 74 of Decoding the Message of the Pulsars.  Other reasons why the pulsar is not the source of the cosmic rays energizing the Crab Nebula are given in my dissertation, in my 1987 Earth, Moon, and Planets paper, and in chapter 10 of my book Earth Under Fire.
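A quick projection check shows why a pulsar displaced 4–5 light years toward us would still appear essentially central: the transverse offset on the sky is tiny.  The ~6500 light-year distance to the Crab used below is a commonly quoted figure assumed here, not stated in the text:

```python
import math

# Projection check for the peripheral-pulsar picture: a pulsar 4.5 ly
# in front of the nebula's center, moving 2 degrees off our line of
# sight, viewed from an assumed distance of ~6500 ly.
DIST_LY = 6500.0
ARCSEC_PER_RAD = 206265.0

offset_ly = 4.5 * math.sin(math.radians(2.0))    # transverse displacement
offset_arcsec = offset_ly / DIST_LY * ARCSEC_PER_RAD
print(offset_arcsec)  # ~5 arcsec: tiny next to the ~3 arcmin nebula radius
```

So even a pulsar well out in front of the nebula would appear within a few arcseconds of the geometrical center, consistent with the projection argument above.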

This high velocity scenario I am proposing suggests either that 1) the Crab supernova explosion was asymmetrical in such a manner as to eject its central neutron star outward in our direction, or 2) the Crab neutron star progenitor was part of a close binary whose partner star destroyed itself in the explosion, simultaneously ejecting and propelling its neutron star companion outward along the pulsar’s current trajectory.  Examples of such hyperfast pulsars are PSR B1508+55 and B1757-24.  B1757-24 is shown in the image below.  If this were the Crab pulsar, we would be far off to the right, viewing the pulsar and its nebula face on.

Pulsar B1757-24 in the constellation of Sagittarius

As I pointed out 28 years ago in my dissertation, an impacting superwave would create a bow shock region around the Crab pulsar.  Hence waves of superwave cosmic rays hitting this shock region, travelling away from us into the plane of the sky at the Crab’s location, would give the appearance to us of concentric rings of X-ray emission expanding away from the Crab pulsar as they proceeded in the anticenter direction to the rear of the pulsar.  The shock front generating these moving X-ray rings (in the vicinity of the Nebula’s luminous wisps) may not necessarily correspond with the shock region that I have suggested is responsible for producing the gamma ray synchrotron emission flares.  There may be several such emission nodes in the supernova shell that would be emitting high energy radiation.  But they may not necessarily all be at the same distance relative to a given cosmic ray front in the superwave.  So although the Crab pulsar X-ray rings and the gamma ray flares are both being energized by superwave cosmic rays, they would not necessarily be impacted simultaneously by a given front.  This would explain why no correlative results are seen for the two emission phenomena.

[I would like to point out here that in giving the above explanation I am not constructing a model a posteriori to fit the data.  My model was proposed 28 years ago and I see no reason to change it.  I am simply explaining how this a priori proposed model would produce the observed results.  In short, findings which astronomers say seem very mysterious are seen not really to be that mysterious after all.]

Crab Nebula flares again

Update to our previous posting about gamma ray flares being observed from the Crab Nebula, ongoing evidence that the nebula is being impacted by superwave cosmic ray electrons.

The Crab Nebula in the constellation of Taurus

BBC News story
PhysOrg.com news story

On April 12th, 2011, the Crab Nebula emitted a gamma ray flare lasting six days that was five times more intense than any of the others that were previously observed and 30 times brighter than the nebula’s normal gamma ray intensity.  On April 16th an even brighter flare occurred but faded out over a period of two days.

As stated in the previous post, I had predicted this high energy variability of the Crab Nebula almost 30 years ago in my Ph.D. dissertation and in a subsequent 1987 journal publication.  It is only recently, with the launching of the Fermi gamma ray telescope, that regular measurements of the Crab Nebula at gamma ray frequencies have become possible.  As I proposed then, the Crab Nebula’s unusually strong luminosity does not originate from its associated neutron star but from a volley of galactic cosmic rays that are striking it face on.  Since this volley can change its intensity quite rapidly, the intensity of the Crab Nebula’s emission will change in step.  This would be most noticeable at gamma ray frequencies, since the very high energy cosmic rays producing the gamma synchrotron radiation lose their energy quite rapidly; hence intensity changes are more noticeable than at, for example, optical frequencies, where the lower energy cosmic rays have lifetimes of many years and hence smooth out any flare activity.

Current claims that the flares are attributable to the Crab’s neutron star are entirely off the mark.  In fact, there is no evidence of any correlated activity in the immediate vicinity of the Crab pulsar.  For example, NASA scientist Martin Weisskopf who was part of a team observing the pulsar with the Chandra X-ray telescope is quoted as stating:

 “Thanks to the Fermi alert, we were fortunate that our planned observations actually occurred when the flares were brightest in gamma rays,” Weisskopf said. “Despite Chandra’s excellent resolution, we detected no obvious changes in the X-ray structures in the nebula and surrounding the pulsar that could be clearly associated with the flare.”

Astronomers are currently at a loss to explain what they are seeing simply because they are ignorant of the superwave theory.  Do your part and inform them.

Is the Crab Nebula being energized by a superwave?

The Crab Nebula in the constellation of Taurus. Courtesy of NASA

In his 1983 Ph.D. dissertation, Paul LaViolette presented the novel theory that most of the radiation coming from the Crab Nebula is not due to cosmic ray emission coming from the Crab pulsar, but rather is produced by a cosmic ray electron volley (a galactic superwave) that is currently propagating toward the galactic anticenter and impacting the remnant face on.  He theorized that these superwave cosmic rays are currently being captured by the magnetized plasma forming the Crab remnant which causes them to emit synchrotron radiation, thus illuminating the nebula.

Recent observations of gamma ray flares in the nebula help support LaViolette’s theory.  Measurements made with the Fermi Gamma-ray Space Telescope have shown that in February 2009 the Crab Nebula’s gamma ray intensity rose by a factor of four over a 16-day period before subsiding back to background levels.  Also, in September 2010 its gamma ray intensity rose sixfold over a 4-day period.  Details are reported in the February 2011 issue of Science magazine.

The cosmic rays producing this gamma emission have such high energies that they radiate away their energy before traveling more than about 0.1 light years.  So if the cosmic rays had originated from the Crab pulsar, all of their emission would have had to come from a region 0.2 light years in diameter centered on the pulsar.  However, the Crab pulsar may be ruled out as the source of these cosmic rays, since observations with the Jodrell Bank radio telescope have shown that during these flares there was no change in the pulsar’s radio flux intensity, pulse shape, or rate of pulse period increase; see the report in the Astronomer’s Telegram.
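
The ~0.1 light year range limit can be sketched with the standard synchrotron cooling-time formula, t ≈ 7.74×10⁸/(γB²) seconds.  The ~100 microgauss nebular field used here is an assumed round value of the order commonly quoted for the Crab, not a figure from the text:

```python
import math

M_E_C2_EV = 5.11e5   # electron rest energy, eV
C_CM_S = 3.0e10      # speed of light, cm/s
LY_CM = 9.46e17      # one light year, cm

def sync_cooling_time_s(energy_ev, b_gauss):
    """Synchrotron cooling time, t ~ 7.74e8 / (gamma * B^2) seconds."""
    gamma = energy_ev / M_E_C2_EV
    return 7.74e8 / (gamma * b_gauss**2)

E = 1e16             # 10 quadrillion eV, as quoted in the text
B = 100e-6           # assumed nebular field, gauss (hypothetical round value)
t = sync_cooling_time_s(E, B)
range_ly = t * C_CM_S / LY_CM   # distance covered at ~c before cooling
print(f"cooling time ~ {t:.1e} s, range ~ {range_ly:.2f} light years")
```

With these assumed inputs the range comes out to roughly a tenth of a light year, consistent with the limit cited above; a stronger assumed field shortens it further.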

Furthermore, the astronomers who studied these flares were puzzled by the finding that the gamma flares were necessarily produced by cosmic ray electrons having energies of 10 quadrillion (10 million billion) electron volts.  These were the highest energy cosmic rays ever observed that could be traced to a specific astronomical object.  The problem is that astronomers have no idea how a pulsar, like the one associated with the Crab Nebula, could have accelerated cosmic ray electrons to such a high energy, or accomplished this acceleration so rapidly.  They say the discovery challenges all theories about how cosmic ray particles are accelerated (see story in Science Daily).

This mystery is easily solved if the cosmic rays producing this gamma ray emission originated from the energetic cosmic ray source residing at the center of our galaxy and are part of a superwave currently impacting the Crab remnant, as suggested by LaViolette.  The flare would indicate that we happened to observe the remnant at a time when it was being impacted by a higher-than-normal density of ultrarelativistic galactic cosmic ray electrons.  As noted from observations of active galactic nuclei, cosmic ray emission intensities can vary considerably.  In this model, the relativistic electrons producing the gamma emission would not come from a point source in or near the nebula but would enter diffusely over the entire nebula.  In the standard explanation, the high energy electrons would be limited to a 0.2 light year diameter region centered on the pulsar before exhausting all of their energy.  According to the superwave theory, these electrons would instead produce emission covering a much larger region, one most likely coinciding with the remnant’s X-ray emission region, which lies to one side of the pulsar and is not necessarily centered on it.

The other unusual finding reported lately is that the Crab Nebula has been gradually dimming.  Sandberg and Sollerman have found that the optical and infrared emission from the Crab Nebula has been declining in intensity by about 0.7 ± 0.4% per year (2.9 ± 1.6 mmag/yr) over the past 20 years.  From 2006 to 2009 an even larger decline of about 2% per year is indicated.  More recently, besides the discovery of the gamma ray flares, scientists have found that the gamma ray emission from the Crab Nebula has been declining quite rapidly.  Observations with the Fermi gamma ray telescope indicate that since the summer of 2008 its gamma ray intensity has declined about 7%; see stories in e! Science News and Before It’s News.  In fact, they have found that its gamma emission has brightened and dimmed three times since 1999 on a timescale of about three years.

Astronomers have been puzzled by this variability, since the standard theory predicts that the Crab pulsar’s radio intensity should show corresponding declines and variations; but as mentioned above, the pulsar’s intensity instead remains relatively constant.  The superwave theory is not troubled by such intensity variations.  In fact, long-term changes in intensity would be entirely expected as the superwave propagates through the remnant.  Eventually, perhaps sometime within the next millennium or so, after the superwave has completely passed through the nebula on its journey away from the Galactic center, the Crab Nebula will cease to shine as it now does.  Its source of illumination will essentially be shut off.

The Fermi Gamma Ray Bubbles: Evidence for the superwave cosmic ray propagation theory

Gamma ray bubbles seen toward the Galactic center extracted by the Harvard team from Fermi telescope data

MSNBC interview on how the superwave theory explains the recently announced Fermi bubbles:  http://www.msnbc.msn.com/id/21134540/vp/40152463#40152463

The team of Harvard University scientists who discovered the Fermi bubbles (Su, Slatyer, and Finkbeiner) leans toward the interpretation that the gamma ray emission from the bubbles is synchrotron radiation created by ultrarelativistic cosmic ray electrons emitted from the core of our Galaxy.  See the draft of their paper posted at: http://arxiv.org/pdf/1005.5480v3.  In general, cosmic ray electrons with energies high enough to produce gamma ray synchrotron radiation have relatively short lifetimes, meaning that they radiate away their energy in a period of time that is short by astronomical standards.  On this basis, the Harvard team estimates that these cosmic ray electrons have a “cooling time” on the order of 10^5–10^6 years.  In other words, we may conclude that they have been in flight for no more than this long, and possibly a shorter period of time.  Thus the bubble would have an age less than this upper limit, and as I show below, we may determine that the main bubble likely has an age of ~40,000 years.
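
An order-of-magnitude check on that cooling time can be made with the same synchrotron formula used for the Crab electrons.  The ~1 TeV electron energy and ~5 microgauss halo field below are assumed round values for illustration, not figures taken from the Harvard paper:

```python
import math

M_E_C2_EV = 5.11e5    # electron rest energy, eV
SEC_PER_YR = 3.156e7  # seconds per year

def sync_cooling_time_yr(energy_ev, b_gauss):
    """Synchrotron cooling time, t ~ 7.74e8 / (gamma * B^2) s, in years."""
    gamma = energy_ev / M_E_C2_EV
    return 7.74e8 / (gamma * b_gauss**2) / SEC_PER_YR

# Assumed inputs: ~1 TeV electrons in a ~5 microgauss field
t_yr = sync_cooling_time_yr(1e12, 5e-6)
print(f"cooling time ~ {t_yr:.1e} yr")
```

For these assumed inputs the result lands within the 10^5–10^6 year range quoted above.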

The Harvard team notes that the outer boundary of the bubble is quite sharply defined and that its interior surface brightness is relatively flat (i.e., uniform).  They state that this indicates a sharp increase in cosmic ray intensity at the bubble walls.  They consider various formation mechanisms and suggest, as one likely mechanism, that the emission is the result of a sudden outburst of cosmic ray electrons from our Galaxy’s core; see p. 39 of their preprint.  They appear to consider a subrelativistic blast-wave model for the outward propagation of the cosmic rays to the bubble perimeter.  However, they note several difficulties with this explanation.  First, the short lifetimes of the cosmic ray electrons would require that they move relatively rapidly outward from the core; second, the uniformity of the bubble emission suggests that little time has been available for nonuniformities to develop in this cosmic ray “wind” through diffusive effects.

Both of these time limit difficulties are solved if the cosmic ray electrons are assumed to propagate radially away from the galactic core in all directions at very close to the speed of light.  In other words, the problems are solved if we assume that these cosmic rays propagate as a superwave.

Illustration of a Galactic superwave outburst excerpted from the video Earth Under Fire

A sudden outburst propagating relativistically would also explain why the bubble maintains a sharp outer edge.  The synchrotron gamma ray emission we see from the bubble would come from cosmic rays that have been scattered away from their rectilinear outward flight, either by collisions with photons or ionized gas, or by being captured into spiral orbits around magnetic field lines.  In this way their gamma ray synchrotron beams, which are directed into very narrow cones aimed along the forward direction of travel (the relativistic beaming effect), become visible to us from our nearly perpendicular viewing direction.
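
The narrowness of those beaming cones is easy to quantify: an electron with Lorentz factor γ beams its synchrotron radiation into a half-angle of roughly 1/γ radians.  Using the 10-PeV electron energy from the Crab discussion as an illustrative input:

```python
# Relativistic beaming sketch: a particle with Lorentz factor gamma
# emits synchrotron radiation into a cone of half-angle ~ 1/gamma.
M_E_C2_EV = 5.11e5   # electron rest energy, eV

energy_ev = 1e16     # a 10-PeV electron, as in the flare discussion above
gamma = energy_ev / M_E_C2_EV
half_angle_rad = 1.0 / gamma
print(f"gamma ~ {gamma:.1e}, beaming half-angle ~ {half_angle_rad:.1e} rad")
```

The cone is so narrow (well under a nanoradian) that we see the radiation only from those electrons whose scattered flight happens to point almost exactly along our line of sight.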

Another difficulty the Harvard team points out concerns the interpretation that the bubbles are produced by cosmic ray jets emitted from the Galactic core: one would not expect two jets to be exactly diametrically opposed to one another.  This problem too is resolved by the superwave model.  The jet model assumes that cosmic rays are emitted as a confined beam in one direction, whereas the superwave model assumes isotropic emission forming a spherical superwave shell that expands in diameter as the superwave moves outward.  The reason we see two opposed bubbles is that cosmic rays aimed upward and downward relative to the galactic plane escape more freely and with greater intensity, because the toroidal magnetic field surrounding the galactic core does not impede their outward flight; see diagram below.

Illustration of the unimpeded propagation of cosmic rays in the Galaxy's polar direction.

The diffuse gamma ray halo (see image below), which was discovered in 1997 and is seen at high galactic latitudes and in all directions around the galactic disc, is also generated by outward moving superwave cosmic rays.

Galactic map showing the Milky Way's diffuse gamma ray halo. Abscissa indicates galactic longitude. (courtesy of D. Dixon, D. Hartmann, E. Kolaczyk, and NASA)

I had pointed this out in 2001 in an updated edition of my Ph.D. dissertation entitled Galactic Superwaves and their Impact on the Earth, and had described this diffuse gamma emission halo as additional evidence that we reside within a shell of outward propagating superwave cosmic rays.  In describing this emission, astrophysicists in 1997 noted that the cosmic ray electrons producing it have relatively short lifetimes and hence could not be propagating very far from their point of generation.  Yet no sources were readily apparent, and so they wondered where these high energy cosmic rays originated.  Dr. Dixon, a member of the discovery team, stated:
“What is so curious about the newly discovered gamma-ray cloud is that the photons do not appear to be coming from any compact sources, like other galaxies or a black hole.  The reason this is interesting is that there isn’t any obvious source for these gamma rays, based on astronomical observations in other wavelengths of light.  That is, as far as we can tell using other telescopes, the space around our galaxy is rather empty of the kinds of things which we would expect to generate gamma rays in the observed brightness distribution.”

In 2001, I believe, I was the first to point out that the cosmic rays producing this diffuse gamma emission had a galactic core origin.  I pointed out that the cosmic ray origin problem noted by the astronomers who discovered the halo could be resolved if we assume that this gamma emission is produced by superwave cosmic rays that have been in flight on the order of 10^4 years or so.

The Fermi bubble emission is part of this omnidirectional diffuse gamma ray halo, but it is brighter than the rest because the cosmic ray electrons traveling in the general direction of the Galaxy’s poles encounter less resistance from the Galaxy’s core magnetic field.  So in suggesting that the Fermi gamma ray bubbles might be produced by cosmic ray electrons recently emitted from the Galactic core, the Harvard team confirms my earlier proposal about the origin of the cosmic rays energizing the Galaxy’s gamma ray halo.

The Harvard team also acknowledges the existence of fainter, larger gamma ray bubbles outside the main inner bubble, noting that these point to previous outburst events and indicate ongoing cyclic activity on a shorter timescale than had previously been recognized.  This supports another aspect of the superwave theory: that superwaves are recurrent.  I have proposed a rather short recurrence period, with major explosions of our galactic core occurring on cycles of 12,000 ± 1,000 and 25,000 ± 2,000 years.

To estimate an age of the gamma ray bubble on the basis of the superwave hypothesis, consider the diagram below.

Estimating the age of the bubbles on the basis of the superwave hypothesis.

Knowing that the outer edges of the bubbles extend 50° above and below the Galactic plane, and that we lie ~23,000 light years from the Galactic center, geometry tells us that the outer edges of the bubbles lie about 27,000 light years from the Galactic plane.  The hypotenuse distance from the upper edge to us calculates to about 36,000 light years.  Adding 27,000 years for the superwave’s speed-of-light flight upward to 36,000 years for the generated gamma emission to reach us implies that the outburst left the core about 63,000 years ago.  Considering that the same superwave took 23,000 years to reach us, we conclude that the superwave event associated with this gamma ray bubble passed us 40,000 years ago (63,000 − 23,000).  When we look at the beryllium-10 ice core record from Vostok, Antarctica, we see that one of the largest magnitude and longest lasting Be-10 peaks is centered at 40,000 years before present; see the diagram below.  In terms of cosmic ray output, it greatly surpasses the smaller peaks dating from the end of the ice age, around 11,000 to 16,000 years ago, these more recent, lower magnitude events presumably being responsible for bringing the ice age to a close.
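
The arithmetic above can be checked in a few lines of Python (distances in light years double as travel times in years, since both the superwave and the gamma rays travel at essentially the speed of light):

```python
import math

# Bubble-age geometry, reproducing the figures quoted in the text.
R_GC_LY = 23_000      # our distance from the Galactic center, light years
LATITUDE_DEG = 50     # galactic latitude of the bubble's outer edge

# Height of the bubble edge above the Galactic plane, and the
# line-of-sight (hypotenuse) distance from that edge to us.
height_ly = R_GC_LY * math.tan(math.radians(LATITUDE_DEG))       # ~27,000 ly
hypotenuse_ly = R_GC_LY / math.cos(math.radians(LATITUDE_DEG))   # ~36,000 ly

# Superwave flight up to the edge, plus light travel from the edge to us:
outburst_age_yr = height_ly + hypotenuse_ly    # ~63,000 yr since the outburst
# The same superwave took 23,000 yr to reach Earth:
passage_age_yr = outburst_age_yr - R_GC_LY     # ~40,000 yr since it passed us

print(round(height_ly), round(hypotenuse_ly), round(passage_age_yr))
```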

Earth's cosmic ray exposure based on the Vostok beryllium-10 deposition rate record

We see a similar double lobe phenomenon going on in other galaxies.  For example, X-ray and radio emission lobes are also seen flanking the core of radio galaxy Centaurus A, the closest galaxy to us exhibiting activity in its core; see the photo below.

Edge on active galaxy Centaurus A showing double lobes of X-ray and radio synchrotron emission

Emission lobes are also seen flanking the galactic core of the edge-on spiral M82; see the Hubble Space Telescope photo below.

The galaxy M82 showing double emission lobes

The discovery of the Voorwerp gas cloud also supports the superwave theory’s claim that core explosions can turn on and off relatively rapidly, i.e., within tens or hundreds of years.  In this case the Voorwerp is observed to be illuminated by light from a quasar in the core of the background spiral galaxy IC 2497, whose core currently appears quiescent; see photo below.  Estimates of the light travel time from the galaxy’s core to the Voorwerp, and then from this cloud to us, suggest that the galaxy’s core quasar shut off and returned to quiescence sometime within the past 70,000 years.

Voorwerp cloud in the foreground is illuminated by light from a quasar in the core of background spiral galaxy IC 2497 that has since shut off. (photo courtesy of WIYN/William Keel/Anna Manning)

See story at: http://www.physorg.com/news/2010-11-cosmic-curiosity-reveals-ghostly-dead.html

The shells imaged recently in galaxy NGC 474 also are evidence of recurrent superwaves propagating isotropically from the cores of active galaxies; see photo below.

Galaxy NGC 474 with surrounding luminous shells. (Courtesy of Mischa Schirmer)

In conclusion, I believe that the discovery of the gamma ray bubble provides strong confirmatory evidence for the superwave theory and is corroborated by the ice core evidence.

For information on past confirmations of the superwave model of relativistic, rectilinear cosmic ray propagation see:
(Prediction No. 2).

Also see the press release at:

(You will notice I don’t use the word “black hole” but instead use the more theory-neutral term “galactic core.” I state my opinion about black holes at the following link: http://starburstfound.org/mother-star-gravity-well/.)

Paul LaViolette, Ph.D.
The Starburst Foundation