Speed of light challenge proposed by physicists, according to New Scientist 23 November 2016 and ScienceDaily 25 November 2016.
Many theories in physics and cosmology, including Einstein’s theory of relativity, are based on the belief that the speed of light in a vacuum is always constant. In the late 1990s this assumption was questioned by João Magueijo of Imperial College London, because of a problem with the Big Bang Theory known as the horizon problem. If the speed of light, and of all electromagnetic radiation, has always been the same, then the universe is too big to have been evenly heated in the time since the Big Bang. The ScienceDaily article explains: “As an analogy, to heat up a room evenly, the warm air from radiators at either end has to travel across the room and mix fully. The problem for the universe is that the ‘room’ – the observed size of the universe – appears to be too large for this to have happened in the time since it was formed”.
The current explanation for this problem is a theory called “inflation”, which claims the early universe started out very small and then suddenly underwent a period of extremely rapid expansion. However, this theory requires the invention of an “inflation field” – a temporary set of conditions that mysteriously came into being some time soon after the beginning and then, equally mysteriously, ceased.
To overcome this problem Magueijo suggested the speed of light was much faster in the past. He and a colleague, Niayesh Afshordi at the Perimeter Institute, Canada, are now proposing a means of testing this theory using measurements of the cosmic background radiation. On the basis of their theory they have made a prediction about a measurement known as the “spectral index”, which relates to small variations in the cosmic microwave background radiation, currently being studied by instruments on satellites. According to ScienceDaily “Their figure is a very precise 0.96478. This is close to the current estimate of readings of the cosmic microwave background, which puts it around 0.968, with some margin of error”.
Magueijo commented: “The theory, which we first proposed in the late 1990’s, has now reached a maturity point – it has produced a testable prediction. If observations in the near future do find this number to be accurate, it could lead to a modification of Einstein’s theory of gravity. The idea that the speed of light could be variable was radical when first proposed, but with a numerical prediction, it becomes something physicists can actually test. If true, it would mean that the laws of nature were not always the same as they are today.”
Editorial Comment: We would like to remind Magueijo, along with both creationists and evolutionists who are tackling some of the problems raised by modern cosmology theories, that the theory of a decreasing speed of light was not first proposed in the late 1990s. Australian Barry Setterfield, with his colleague Trevor Norman of Flinders University in Adelaide, proposed this theory in the early 1980s. This editor attended one of Setterfield’s lectures on the subject in 1983, and Setterfield and Norman had been working on it for several years before that. Setterfield did not pass the Politically Correct barrier as he was a creationist. Since then he has developed the theory further, and his research and reports can be read at: http://www.setterfield.org/.
It will be interesting to see what the results of Magueijo’s and Afshordi’s proposal turn out to be. However, their theory is still based on the Big Bang, which has other problems, so we caution anyone against getting too excited about it. Instead, we advise cosmologists and physicists to start with the word of the Creator, who said “Let there be light” in the beginning.
Evidence News vol. 16 No. 24
14 December 2016
Creation Research Australia