A paper is being published that claims there is a real, observed seasonal variation in nuclear decay rates.
The decay-rate changes were observed in silicon-32 and radium-226 and were reported independently of each other. (That would rule out something unique to the decay of one particular nucleus: the fact that it has been seen in two systems, by two different teams, suggests a real and likely universal effect.)
The really curious thing is that this variation seems to depend on the season of the year, which suggests it has something to do with the Earth's distance from the sun.
So what could account for this?
According to the authors of the paper, as reported by the arXiv blog:
“Jenkins and co put forward two theories to explain why this might be happening.
First, they say a theory developed by John Barrow at the University of Cambridge in the UK and Douglas Shaw at the University of London suggests that the sun produces a field that changes the value of the fine structure constant on Earth as its distance from the sun varies during each orbit. Such an effect would certainly cause the kind of annual variation in decay rates that Jenkins and co highlight.
Another idea is that the effect is caused by some kind of interaction with the neutrino flux from the sun’s interior, which could be tested by carrying out the measurements close to a nuclear reactor (which would generate its own powerful neutrino flux).
It turns out that the notion that nuclear decay rates are constant has been under attack for some time. In 2006, Jenkins says, the decay rate of manganese-54 in their lab decreased dramatically during a solar flare on 13 December.
And numerous groups disagree over the decay rate for elements such as titanium-44, silicon-32 and cesium-137. Perhaps they took their data at different times of the year.”
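To get a feel for what such a signal would look like, here is a minimal sketch in Python of the usual way this kind of claim is framed: a decay constant with a small sinusoidal modulation at a one-year period, which is what a dependence on the Earth-sun distance would produce. The isotope, the modulation amplitude, and the phase below are illustrative assumptions, not numbers from the paper.

```python
import numpy as np

# A minimal sketch, not the authors' analysis: give the decay constant a small
# one-year sinusoidal wobble and look at what survives after removing the
# smooth exponential trend. All numerical values below are illustrative.

HALF_LIFE_DAYS = 172.0 * 365.25          # roughly silicon-32
LAMBDA_0 = np.log(2) / HALF_LIFE_DAYS    # nominal decay constant, per day
AMPLITUDE = 1.5e-3                       # assumed fractional size of the wobble
PHASE_DAY = 30.0                         # assumed day of year the wobble peaks

def decay_constant(t_days):
    """Decay 'constant' with a one-year sinusoidal modulation."""
    return LAMBDA_0 * (1.0 + AMPLITUDE *
                       np.cos(2.0 * np.pi * (t_days - PHASE_DAY) / 365.25))

def activity(t_days, a0=1.0e4):
    """Expected count rate at time t, with the unmodulated rate at t=0 set to a0."""
    # Activity ~ lambda(t) * N(t). Over a few years of a ~172-year half-life,
    # the visible seasonal effect is dominated by the wobble in lambda itself.
    t = np.asarray(t_days, dtype=float)
    # cumulative integral of lambda(t') dt' via the trapezoid rule
    integral = np.concatenate(([0.0], np.cumsum(
        0.5 * (decay_constant(t[1:]) + decay_constant(t[:-1])) * np.diff(t))))
    return a0 * decay_constant(t) / LAMBDA_0 * np.exp(-integral)

days = np.arange(0.0, 3 * 365.25)        # three years of daily measurements
rates = activity(days)

# Remove the smooth exponential trend to expose the seasonal residual,
# the kind of few-per-mille annual wiggle being reported.
slope, intercept = np.polyfit(days, np.log(rates), 1)
residual = rates / np.exp(intercept + slope * days) - 1.0
print("peak-to-peak seasonal residual: %.2e" % (residual.max() - residual.min()))
```

Detrending the counts and looking for a residual with a one-year period is essentially the test: if that residual tracks the Earth-sun distance, that is the effect being claimed.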
Read the full article here.
Neat, huh?