Intergalactic Gas Heating Over Time

Does the observation that intergalactic gas clouds are heating over time support the SQK prediction that more gas should nucleate over time from the underlying etheric matrix? Does an increasing density of gas over time in the intergalactic medium imply higher gas temperatures?


Response from P. LaViolette:

I would answer the above question as follows.  Subquantum kinetics (SQK) is in agreement with the findings of this study.  The observed progressive increase in the temperature of the intergalactic gas can partly be attributed to neutrons continuously materializing in space, which shortly after their appearance undergo beta decay into protons and relativistic electrons.  These materialization decay products in turn provide the energy source that heats the WHIM (warm-hot intergalactic medium).  WHIM at high redshifts appears cooler to us than WHIM at more moderate redshifts because of the greater tired-light energy loss suffered by photons reaching us from that earlier epoch.

The group that performed this study, Becker et al., suggests that the temperature rise is due to the heating effect of the radiation output from quasars and active galactic nuclei.  This is indeed a contributing factor, but one that supplements the energy continuously released by ongoing matter creation.  We must also consider that subquantum kinetics predicts that galaxies grow in size over time and develop increasingly massive and energetic active galactic cores in increasing numbers, which would be another factor contributing to the progressive rise in WHIM temperature.  In subquantum kinetics the rise in WHIM temperature may thus be attributed to the ongoing violation of energy conservation occurring throughout the universe; this is permissible since SQK maintains that the universe operates as an open system.

The study of Becker et al. found that between redshift era z = 4.4 and redshift era z = 2.05 the intergalactic gas temperature increased from about 8,200 to 13,900 kelvin, roughly a 70% rise.  These quoted temperatures are averages of their two models (γ = 1.5 and γ = 1.3).  In the tired-light cosmology of subquantum kinetics, this temperature increase occurs between an epoch dating 22.8 billion years ago and an epoch dating 15 billion years ago, hence over a period of about 8 billion years.  According to the Stefan-Boltzmann law, radiant energy density increases as the fourth power of temperature (E ∝ T⁴).  Hence the energy density of intergalactic space increased about 7.8-fold during this period.
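The epoch conversions and the fourth-power scaling above can be checked with a short script.  Note that the exponential tired-light relation t = ln(1 + z)/H₀ and the H₀ value of roughly 72 km/s/Mpc are assumptions introduced here for illustration, not figures stated in the text; with them, the quoted lookback times are reproduced to within rounding, and the rounded average temperatures give an energy-density factor slightly above the quoted 7.8.

```python
import math

# Assumed exponential tired-light lookback relation: t = ln(1 + z) / H0.
# H0 ~ 72 km/s/Mpc is an illustrative assumption, converted to 1/Gyr
# (1 Mpc ~ 3.0857e19 km, 1 Gyr ~ 3.156e16 s); result is ~0.0736 per Gyr.
H0_PER_GYR = 72 / 3.0857e19 * 3.156e16

def tired_light_lookback(z):
    """Lookback time in Gyr for redshift z under the assumed tired-light law."""
    return math.log(1 + z) / H0_PER_GYR

t_high = tired_light_lookback(4.4)   # ~22.9 Gyr (article quotes 22.8)
t_low = tired_light_lookback(2.05)   # ~15.1 Gyr (article quotes 15)

# Stefan-Boltzmann fourth-power scaling of radiant energy density with
# temperature, using the rounded average temperatures quoted above.
ratio = (13_900 / 8_200) ** 4        # ~8.3 with these rounded inputs

print(f"lookback at z = 4.4:  {t_high:.1f} Gyr")
print(f"lookback at z = 2.05: {t_low:.1f} Gyr")
print(f"energy-density factor: {ratio:.1f}")
```

The small gap between the computed ~8.3 and the quoted 7.8 presumably reflects rounding of the two-model average temperatures before the fourth power was taken.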

These temperature observations, however, raise serious doubts about the big bang theory and its expanding-universe hypothesis, something not mentioned in this news release.  In the big bang theory the time elapsed between the z = 4.4 epoch and the z = 2.05 epoch amounts to just 1.8 billion years.  Moreover, the big bang theory predicts that space expanded 15% in this interval, which means it requires a 9-fold increase in energy input to produce this 7.8-fold increase in energy density.  So in the big bang cosmology this energy density increase would have to occur about five times faster than in the subquantum kinetics cosmology.
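The factor-of-five comparison above is a simple rate calculation, sketched here with the figures quoted in the text (a 9-fold energy input over 1.8 billion years versus a 7.8-fold increase over about 8 billion years):

```python
# Required heating rates (energy-input factor per Gyr) under the two
# cosmologies, using the figures quoted in the text.
bb_rate = 9.0 / 1.8     # big bang: 9-fold input over ~1.8 Gyr -> 5.0 per Gyr
sqk_rate = 7.8 / 8.0    # SQK: 7.8-fold increase over ~8 Gyr -> ~0.98 per Gyr

speedup = bb_rate / sqk_rate  # ~5.1, i.e. roughly five times faster
print(f"big bang heating must proceed ~{speedup:.1f}x faster")
```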

Becker et al. suggest that galactic core explosions are the energy source causing this heating.  However, as I point out in the fourth edition of Subquantum Kinetics, even under the most liberal assumptions it would take at least 25 billion years for galactic core explosions to provide the required energy input, whereas the big bang allows less than 2 billion years for this to take place.  So the observed rise in the temperature of the intergalactic medium places the big bang theory and standard cosmology in a rather difficult position.

Original posting May 23, 2011, updated on February 22, 2013
