Hi higgsy,
I appreciate your questions about my posts.
I respect you and your opinions; you give the impression of being well qualified
and familiar with physics, maths and cosmology at university level or beyond,
so fundamentally you have me at a disadvantage. On some things where I have no expertise, I have to read, search, and evaluate what I can through my own credibility filter. You seem to have a thorough and competent understanding of the latest integrated theories of physics and cosmology.
But so does Pierre-Marie Robitaille: he has qualifications, is a professor at a university, has published many peer-reviewed papers, and has a demonstrated practical knowledge of real-world diagnostic imaging and data analysis, yet he does not believe the CMB is evidence of the Big Bang.
https://www.youtube.com/watch?v=i8ijbu3bSqI
https://www.youtube.com/watch?v=3Hstum3U2zw
"Professor Robitaille joined the Department of Radiology in 1989. At the time, his research centered upon spectroscopic methods, with a focus on the experimental and theoretical aspects of Nuclear Magnetic Resonance and Magnetic Resonance Imaging (MRI). He devoted considerable attention to NIH funded spectroscopic analysis of in-vivo cardiac metabolism in the normal and failing heart, using both 13C- and 31P- NMR methods. He also focused on the development of new instrumentation for MRI. This included the design and assembly of the first torque compensated asymmetric gradient coil."
"From 1995-2000, Professor Robitaille was responsible for conceiving and assembling, at OSU, the world's first ultra high field clinical MRI system. This 8 Tesla/80cm MRI system was utilized to acquire many of the highest resolution images in existence. At the same time, early results with this instrument prompted a reconsideration of RF power requirements in MRI and of signal to noise. In turning his attention to these problems, Professor Robitaille initially sought to consider NMR as a "thermal" process. In the early days of this modality, the T1 relaxation time was also known as the "thermal" relaxation time. This would lead to a detailed study of Kirchhoff's Law of Thermal Emission, a topic on which he has subsequently published extensively."
"Kirchhoff's Law stands at the very heart of spectroscopic analysis, not only in medicine, but also in fields as seemingly remote as astronomy. For Professor Robitaille, revisiting Kirchhoff's Law of Thermal Emission has resulted in questioning many established ideas in astronomy, including the origin of the microwave background and, most importantly, the nature of the Sun itself. That is because the standard model of the Sun relies on the validity of Kirchhoff's Law in order to justify a gaseous state. Conversely, if Kirchhoff's Law is not valid, then the Sun cannot be gaseous in nature. Along these lines, Professor Robitaille has recently advanced forty lines of evidence that the Sun is comprised of condensed matter."
http://radiology.osu.edu/10717.cfm
Robitaille has often been attacked in a most unscientific way, with pure ad hominem slander;
thankfully human reason can prevail in spite of statisticians, opinion polls, and chook yard politics...
If there are errors of understanding in the nature of the sun, in the physics of the electromagnetic wave (including light and other radiation), in how electric and magnetic fields may work in space, and even errors in the physical constants, then we have a lot of homework to do...
I'm an artist with an interest in life, science and truth generally,
so in a way I might be representative of the 'general public',
i.e. many people who admire modern scientific achievements and
technologies, like computing, space and other modern wonders,
..... but....have a 'feeling' that there is a systematic failure somewhere...
This guy says it better than I could:
"During my professional career, all I have seen is failure. A failure of particle physicists to uncover a more powerful mathematical framework to improve upon the theories we already have. Yes, failure is part of science – it’s frustrating, but not worrisome. What worries me much more is our failure to learn from failure. Rather than trying something new, we’ve been trying the same thing over and over again, expecting different results."
"The idea of naturalness that has been preached for so long is plainly not compatible with the LHC data, regardless of what else will be found in the data yet to come. And now that naturalness is in the way of moving predictions for so-far undiscovered particles – yet again! – to higher energies, particle physicists, opportunistic as always, are suddenly more than willing to discard of naturalness to justify the next larger collider."
"Now that the diphoton bump is gone, we’ve entered what has become known as the “nightmare scenario” for the LHC: The Higgs and nothing else. Many particle physicists thought of this as the worst possible outcome. It has left them without guidance, lost in a thicket of rapidly multiplying models. Without some new physics, they have nothing to work with that they haven’t already had for 50 years, no new input that can tell them in which direction to look for the ultimate goal of unification and/or quantum gravity."
"That the LHC hasn’t seen evidence for new physics is to me a clear signal that we’ve been doing something wrong..."
It's the high-profile quasi-religious/philosophical meanderings about multiverses and quantum entanglement, the incompatibility of GR and quantum mechanics, the action-at-a-distance problem... these suggest to me that we should all step back a bit and maybe reconsider some real practical physics basics: the nature of empty space, which is supposed to be physically measurable, and its implications; the possible analogue wave nature of light itself; the physical nature of charge as opposed to the way charge interacts with matter; the possible analogue-to-digital nature of light and matter, etc. Obviously it's hard, or even impossible, for some people who have invested up to a lifetime learning, studying or teaching in a particular consensus paradigm; for them, or even you, change may be unthinkable...
https://www.quantamagazine.org/20160809 ... r-physics/
That's roughly what I'm doing here in this Thunderbolts project... just looking and thinking about what other people are thinking about... and having some fun...
I think it's time to have a critical re-evaluation of what Faraday, Maxwell, de Broglie, Einstein etc. believed, and see if we have taken a wrong turn somewhere... some more physical experiments. This current notion of expanding space should have some measurable effect on the speed of light, gravity etc., testable with some well designed laser experiments, for example...
That said, I hope we can have a productive discussion. Back to my understanding of the CMB:
The primary reasons for my concerns about the validity of the CMB as evidence of the
hypothetical Big Bang are these:
1. At first the CMB radiation was measured on earth, pointing at the sky, and announced as isotropic.
2. Then for theoretical reasons, it was ***necessary to find anisotropy.
To find out more, I read "The First Three Minutes", George Smoot's "Wrinkles in Time", etc
3. I became concerned: it seemed to me impossible to 'take a picture' of the background sky behind the dust and stars and planets and all the stuff that's obscuring the background horizon.
I'm an artist, and I have had extensive experience with computers since 1981, and with graphics software; images are an area I'm very familiar with: I know layer filters, layer modes, 3D skymaps, texturing of spheres etc.
So I'm skeptical of the fundamental claim that the images produced by COBE, WMAP and Planck are real. I don't believe that this 'feat' of viewing an image through the 'foreground contamination' is possible.
I'm suspicious of the enormous and valiant efforts of these obviously very gifted mathematicians to find the Emperor's clothes...
"The power spectrum is the cornerstone of the whole effort: it’s this statistical map that cosmologists base their CMB analysis on."
"The cosmologists then make some assumptions about what kind of universe they’re dealing with — in astrospeak, they assume the standard lambda-CDM model, which includes (1) a particular solution to the general relativistic equations of gravity, (2) a universe that looks basically the same on large scales and is expanding, (3) an early period of stupendous expansion called inflation, and (4) quantum fluctuations that seeded today’s large-scale matter distribution."
"From there, they start tweaking the assumptions, like a dressmaker tucking and letting out a dress pattern until it fits right. They could even chuck any assumption that proves to be bad. Eventually, they find the pattern that most successfully fits the CMB."
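The 'tucking and letting out' the article describes is ordinary parameter fitting: vary the model's parameters and keep whichever values minimize the mismatch with the measured spectrum. Here is a deliberately tiny sketch of that procedure in Python; the numbers and the one-parameter 'model' are invented purely for illustration, not taken from any real analysis:

```python
# Toy version of "tweak until it fits": scan one model parameter and keep
# the value whose predicted spectrum best matches the measurement.
measured = [2.1, 3.9, 6.2, 8.0]              # invented "band powers"

def model(amp):
    # A deliberately simple one-parameter model: power grows linearly.
    return [amp * (i + 1) for i in range(len(measured))]

def chi2(obs, pred):
    # Goodness of fit: sum of squared residuals.
    return sum((o - p) ** 2 for o, p in zip(obs, pred))

# Grid-scan the amplitude from 1.00 to 3.00 in steps of 0.01 and keep
# the best-fitting value.
best_chi2, best_amp = min(
    (chi2(measured, model(a / 100)), a / 100) for a in range(100, 301)
)
print(best_amp, best_chi2)
```

Real pipelines do the same thing with six or more parameters and a likelihood instead of a bare chi-square, but the logic is the same: the parameters are adjusted until the pattern fits.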
With respect, this proves that the CMB as evidence for the Big Bang is based on wishful thinking.
Q.E.D.
"...Planck predicts about 2.5 times more clusters than are actually observed. This could be due to error in the estimates from either side, or due to ***new physics."
http://www.skyandtelescope.com/astronom ... 210201523/
The Planck data is apparently stored in a special format which is supposed to make the subsequent mathematical probabilities 'easier', but it smells even worse than Penzias and Wilson's pigeons:
Planck uses a different format, essentially a database of measurements from which an image-like diagram can be extracted; it's not a picture of anything...
HEALPix: "1. Hierarchical structure of the data base. This is recognised as essential for very large data bases, and was postulated in construction of the Quadrilateralized Spherical Cube (or quad-sphere, see http://space.gsfc.nasa.gov/astro/cobe/skymapinfo.html), which was used for the COBE data."
"An argument in favour of this proposition states that the data elements which are nearby in a multi-dimensional configuration space (here, on the surface of a sphere), are also nearby in the tree structure of the data base, hence the near-neighbour searches are conducted optimally in the data storage medium or computer RAM. This property, especially when implemented with a small number of base resolution elements, facilitates various topological methods of analysis, and allows easy construction of wavelet transforms on quadrilateral (and also triangular) grids."
"Figure 1 shows how a hierarchical partition with quadrilateral structure naturally allows for a binary vector indexing of the data base."
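For what it's worth, the 'binary vector indexing' idea itself is the standard quadtree trick of interleaving bits. This little sketch is my own illustration of that principle, not the actual HEALPix code:

```python
def nested_index(x, y, levels):
    # Quadtree address of cell (x, y) on a 2**levels x 2**levels grid:
    # at each subdivision level, two bits (one from y, one from x)
    # select which of the four child quadrants the cell falls in.
    idx = 0
    for level in reversed(range(levels)):
        idx = (idx << 2) | (((y >> level) & 1) << 1) | ((x >> level) & 1)
    return idx

# Cells that are close on the grid share their leading bits, so they sit
# close together in the tree, and hence on disk or in RAM.
print(nested_index(0, 0, 2), nested_index(1, 0, 2), nested_index(3, 3, 2))
```

That neighbour-locality is the 'near-neighbour searches are conducted optimally' property the quote is claiming.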
"3. Iso-Latitude distribution of discrete area elements on a sphere. This property is critical for computational speed of all operations involving evaluation of spherical harmonics.
"Since the associated Legendre polynomial components of spherical harmonics are evaluated via slow recursions, and can not be simply handled in an analogous way to the trigonometric Fast Fourier Transform, any deviations in the sampling grid from an iso-latitude distribution result in a prohibitive loss of computational performance with the growing number of sampling points, or increasing map resolution."
"It is precisely this property that the COBE quad-sphere is lacking, and this renders it impractical for applications to high resolution data."
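The computational claim here can be sanity-checked with a back-of-envelope operation count. The pixel and ring counts below are the published HEALPix relations (Npix = 12 x Nside^2, rings = 4 x Nside - 1); the band limit lmax = 2 x Nside is my assumption of a typical choice:

```python
# Back-of-envelope: how many slow Legendre recursions a spherical-harmonic
# analysis needs, with and without an iso-latitude grid.
nside = 512
npix = 12 * nside ** 2                # HEALPix pixels on the sphere
nrings = 4 * nside - 1                # HEALPix iso-latitude rings
lmax = 2 * nside                      # a typical band limit (my assumption)
n_lm = (lmax + 1) * (lmax + 2) // 2   # distinct (l, m) pairs

scattered = npix * n_lm       # scattered pixels: one recursion per pixel
iso_latitude = nrings * n_lm  # iso-latitude: one recursion per ring, reused
print(scattered // iso_latitude)      # speedup factor, roughly npix / nrings
```

So the Legendre work drops by a factor of roughly a thousand, which is the 'prohibitive loss of computational performance' the quote refers to, seen from the other side.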
"A number of tessellations have been used for discretisation and analysis of functions on the sphere (for example, see (Driscoll & Healy (1994)), (Muciaccia, Natoli & Vittorio (1998)) — rectangular grids, (Baumgardner & Frederickson (1985)), (Tegmark (1996)) — icosahedral grids, (Saff & Kuijlaars (1997)), (Crittenden & Turok (1998)) — 'igloo' grids, and (Szalay & Brunner (1998)) — a triangular grid), but none satisfies simultaneously all three stated requirements."
"All three requirements formulated above are satisfied by construction with the Hierarchical Equal Area, iso-Latitude Pixelisation (HEALPix) of the sphere, which is shown in Figure 2. A more detailed description of HEALPix, its motivations, and applications can be found in (Gorski et al (2005))."
Q: Why on earth would you have all this complexity in an image map?
A: To make the subsequent probability functions easier.
http://healpix.jpl.nasa.gov/pdf/intro.pdf
At the beginning of the COBE era, megapixel files were considered huge:
"The ultimate data products of these missions — multiple microwave sky maps, each of which
will have to comprise more than ∼10^6 pixels in order to render the angular resolution
of the instruments — will present serious challenges to those involved in the analysis and
scientific exploitation of the results of both surveys."
"As we have learned while working with the COBE mission products, the digitised sky map
is an essential intermediate stage in information processing between the entry point of
data acquisition by the instruments — very large time ordered data streams, and the final
stage of astrophysical analysis — typically producing a 'few' numerical values of physical
parameters of interest."
"COBE-DMR sky maps (angular resolution of 7° (FWHM) in three frequency bands,
two channels each, ***6144 pixels per map) were considered large at the time of their release."
"As for future CMB maps, a whole sky CMB survey at the angular resolution of ∼10'
(FWHM), discretised with a few pixels per resolution element (so that the discretisation
effects on the signal are sub-dominant with respect to the effects of instrument’s angular
response), will require map sizes of at least Npix ∼ a few × 1.5 × 10^6 pixels."
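The pixel counts quoted above can be checked directly: HEALPix maps hold Npix = 12 x Nside^2 pixels, and COBE's quad-sphere held six cube faces of 32 x 32 pixels. A quick consistency check:

```python
# HEALPix total pixel count: Npix = 12 * Nside**2 (Nside a power of two).
def healpix_npix(nside):
    return 12 * nside * nside

# COBE's quad-sphere: 6 cube faces of 32 x 32 pixels each.
cobe_pixels = 6 * 32 ** 2
print(cobe_pixels)            # matches the 6144 pixels per map quoted above

# "a few x 1.5 * 10^6 pixels" corresponds to Nside around 512:
print(healpix_npix(512))      # about 3.1 million pixels
```

So the "few megapixel" figure in the 1999-era HEALPix paper corresponds to an Nside of roughly 512; the later Planck maps went to larger Nside values than that.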
This was written when an image file of a few megapixels probably did seem large...
nothing by today's standards, but since we're talking here about the image of 'creation',
the most amazing image of all time according to some:
an image of less than 4 megapixels! I was shocked at how low-res it was...
Someone please tell me I'm wrong here...
Then the raw satellite data have to be massaged to create an image,
since they are basically a set of differential numbers;
here on earth you can't get anything but an isotropic flat image...
Here are some comments about the processing of COBE data:
How to take stuff out:
"it is customary to remove a best-fit monopole and dipole from the data
before performing any further analysis.
Unfortunately, since incomplete sky coverage destroys the orthogonality of
the spherical harmonics, this procedure **covertly removes part of the
contribution of the higher multipoles."
"There are two ways to compensate for this. The first option is to treat the monopole
and dipole coefficients <...> as nuisance parameters,
i.e., quantities whose true values we neither know nor care about."
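The leakage effect described here is easy to reproduce in one dimension: fit and remove a 'monopole + dipole' (a constant plus a slope) from a pure higher-frequency wave, once with full coverage and once with part of the domain cut away. This is my own toy sketch, not anything from the paper:

```python
import math

def fit_and_remove_linear(xs, ys):
    # Least-squares fit y ~ a + b*x (the 1-D analogue of monopole + dipole),
    # then subtract the fit from the data.
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b, [y - (a + b * x) for x, y in zip(xs, ys)]

# A pure "higher multipole": three full cosine cycles, zero mean, zero slope.
wave = lambda x: math.cos(2 * math.pi * 3 * x)

# Full coverage: the fitted constant and slope are essentially zero,
# so nothing is removed from the wave.
xs_full = [i / 600 for i in range(600)]
a_full, b_full, _ = fit_and_remove_linear(xs_full, [wave(x) for x in xs_full])

# Partial coverage (a "galactic cut" removing 40% of the domain): the same
# fit now soaks up a large chunk of the wave, the leakage described above.
xs_cut = [x for x in xs_full if x < 0.6]
a_cut, b_cut, _ = fit_and_remove_linear(xs_cut, [wave(x) for x in xs_cut])
print(round(b_full, 3), round(b_cut, 3))
```

With the cut in place the fitted slope is far from zero even though the underlying wave has none: removing the 'monopole and dipole' has indeed removed part of the higher-frequency signal.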
and how to put stuff back in:
"In the context of Bayesian analysis, the natural thing to do with
nuisance parameters is to marginalize over them."
"...if we are performing some sort of data compression, then we have a second
option for dealing with the monopole and dipole."
"We can simply impose a constraint on our compression matrix A, requiring
it to be insensitive to the unwanted multipoles."
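Mechanically, 'insensitive to the unwanted multipoles' just means each row of the compression matrix A is made orthogonal to the monopole and dipole templates. A small self-contained sketch of that construction, again my own illustration:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def orthogonalize(vectors):
    # Gram-Schmidt: an orthogonal basis spanning the template directions.
    basis = []
    for v in vectors:
        w = list(v)
        for b in basis:
            c = dot(w, b) / dot(b, b)
            w = [wi - c * bi for wi, bi in zip(w, b)]
        basis.append(w)
    return basis

def make_insensitive(row, templates):
    # Remove the row's component along each template, so the compressed
    # number dot(row, data) cannot see any monopole/dipole contribution.
    new = list(row)
    for b in orthogonalize(templates):
        c = dot(new, b) / dot(b, b)
        new = [ri - c * bi for ri, bi in zip(new, b)]
    return new

n = 8
monopole = [1.0] * n                    # constant template
dipole = [i / n for i in range(n)]      # linear template

row = make_insensitive([1, -2, 3, 0, 1, 2, -1, 0], [monopole, dipole])

data = [0.3, -0.1, 0.2, 0.0, -0.4, 0.1, 0.5, -0.2]
# Adding ANY amount of monopole or dipole leaves the compressed value alone:
shifted = [d + 7.0 * m + 3.0 * t for d, m, t in zip(data, monopole, dipole)]
print(abs(dot(row, data) - dot(row, shifted)) < 1e-9)
```

So the unwanted modes are never 'seen' by the compressed data at all, rather than being fitted and subtracted afterwards.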
"People frequently remove the quadrupole information from the COBE data
in the same way as the monopole and dipole, on the grounds that the
quadrupole is particularly susceptible to Galactic contamination."
'Anomalies' are noted:
"It has also been known since the earliest days of COBE analysis
that the quadrupole is anomalously low
(compared to the prediction of a power spectrum normalized
to the other multipoles)."
The issue of ethics is touched on:
"From a statistical point of view, this is a delicate situation:
it is perfectly acceptable, and even wise, to throw away data
if there is a reasonable fear of contamination, but throwing away data
that is known a priori to be discordant with favored theories
is a major statistical faux pas...."
https://uam.es/personal_pas/txrf/frm/as ... 607088.pdf
This is why it appears impossible for normal unsophisticated people like myself to find out the equivalent image resolutions of the different skymaps. The image data is embedded in the HEALPix file, which really holds a whole bunch of numbers, stored at different resolutions and meaning different things...
There is no single resolution; it's buried in multiple mathematical datasets that have different 'resolutions' depending on what you want to examine, like field density... and the cartographic projection is a new one, never seen in science before, a combination of two different projections... all to ease the probability functions.
But the Emperor has no clothes! It ***cannot work...
As an analogy: if you were to give me a rectangular school photo of a group of people, in a typical format with a resolution of say 2500 x 1800 pixels, and I were to superimpose a layer containing an image from the Planck team of just the foreground contamination recorded by that satellite, and then 'flatten' that image layer, it would destroy most of the image information in the school photo... I would be left with a hybrid image of no scientific or nostalgic value; I might recognise a few faces, provided they weren't near the galactic centre, but most picture information and quality would be lost.
To pretend that I could get the school photo faces back by Bayesian or Monte Carlo probability functions is just not believable...
In the case of the CMB, the theory is that the original background, aka the school photo, is almost completely featureless and hides behind dust maps, stars, dipole moments and such... to 'see' the smooth background you have to 'subtract' all the dirt, dust and pigeon poop, and you are left with gaps.
These holes of missing information are then mathematically filled in by a very fancy probabilistic clone tool, which makes the unbelievable claim that it can introduce information which has never been, or 'probably' never can be, made visible or measurable. Then we have to believe we can examine an edited, photoshopped, 'smoothed' skymap and distinguish values to a precision of approx 3 in 100,000.
I just don't believe that the CMB radiation is from the Big Bang... convince me otherwise.
