Photons emerge from the Sun's core only slowly; neutrinos, once produced, pass straight through.
Radioactive decay is supposed to be the ultimate random process, immutably governed by an element's half-life and nothing else. There is no way to determine when a single radioactive atom will spontaneously decay, nor any way to speed up or slow down the process. This ironclad certainty has always been the best argument of opponents of conventional nuclear fission power generation, as it means that the inevitable nuclear waste will have to be kept isolated from the biosphere for millions of years (notwithstanding recent research attempts at stimulated transmutation of some of the longer-lived waste products).
When plotting the activity of a radioactive sample, you expect a graph like the following: a smooth decrease with slight, random variations.
Detected activity of the 137Cs source. The first two points correspond to the beginning of data taking. Dotted lines represent a 0.1% deviation from the exponential trend. The lower panel shows the residuals of the measured activity relative to the exponential fit; error bars include statistical uncertainties and fluctuations.
(This graph stems from a measurement of the beta decay of 137Cs and was taken deep underground.)
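To get a feel for the scale of those "slight, random variations", here is a minimal sketch (with my own illustrative numbers, not the experiment's actual counts) of such an activity curve: a slow exponential decrease with Poisson counting noise on top.

```python
import numpy as np

# Illustrative parameters, not taken from the underground experiment.
half_life_days = 11_000.0              # 137Cs half-life is about 30.1 years
lam = np.log(2) / half_life_days       # decay constant per day

t = np.arange(365)                     # one year of daily measurements
expected = 1e6 * np.exp(-lam * t)      # expected counts per day
measured = np.random.default_rng(0).poisson(expected)

# With ~1e6 counts per point, counting statistics alone predict a relative
# scatter of about 1/sqrt(N) ~ 0.1% around the exponential trend.
residuals = (measured - expected) / expected
```

Note that the 0.1% scatter from counting statistics in this toy example is of the same order as the 0.1% dotted lines in the caption above.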
What you don't expect is a discernible pattern in the variations of a radioactive element's decay rate, nor any correlation with outside events. But this is exactly what Jere H. Jenkins et al. found:
Plot of measured 36Cl decays taken at the Ohio State University Research Reactor (OSURR). The crosses are the individual data points, and the blue line is an 11-point rolling average. The red curve is the inverse of the square of the Earth–Sun distance. (Error bars are only shown for a limited number of points).
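The two overlaid curves in that plot can be sketched as follows (with assumed numbers, not the OSURR data): an 11-point rolling average of a noisy series, compared against the inverse square of the Earth–Sun distance, which varies over the year because of the orbit's eccentricity.

```python
import numpy as np

# Earth-Sun distance model: eccentricity ~0.0167, perihelion near January 3.
day = np.arange(365)
ecc = 0.0167
r = 1 - ecc * np.cos(2 * np.pi * (day - 3) / 365.25)   # distance in AU
inv_r2 = 1.0 / r**2        # varies roughly 7% peak to trough over the year

# Synthetic noisy measurement series, then an 11-point rolling average
# like the blue line in the plot.
noisy = inv_r2 + np.random.default_rng(1).normal(0.0, 0.01, day.size)
smoothed = np.convolve(noisy, np.ones(11) / 11, mode="valid")
```

The rolling average suppresses the point-to-point noise while leaving the slow annual modulation intact, which is why it is the natural tool for making such a correlation visible.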
And now this surprising result of the Sun's influence has been corroborated.
The latest research was a collaboration of Stanford and Purdue University with the Geological Survey of Israel, rather reputable research powerhouses, which makes these results difficult to dismiss. Their paper contains the following contour graph of the measured gamma decay over the course of the day, plotted across several years. When comparing this with the same kind of graph of the Sun's elevation over the observed date range, the correlation is quite obvious:
Gamma measurements as a function of date and time of day; the color bar gives the power, S, of the observed signal (top). Solar elevation as a function of date and time of day (bottom).
There is a video talk on this phenomenon available online. It takes some patience to sit through, but it gives a more complete picture, explaining how these observed patterns can be correlated with the Sun's core activity with surprising accuracy.
The evidence for the reality of this effect is surprisingly good, and that is rather shocking. It does not fit into any established theory at this time.
Update and Forums Round-Up
This was the second blog post from this site to be picked up on Slashdot (this was the first one). Last time around, WordPress could not handle the load (the so-called Slashdot effect). I subsequently installed the W3 Total Cache plug-in, so before getting back to the physics, I want to use this space to give its developers a big shout-out. If you operate a WordPress blog, I can highly recommend this plug-in.
This article received almost 30,000 views over two days. The resulting discussions fleshed out some great additional information, but also highlighted what can easily be misread or misconstrued. Top of the list was the notion that this might undermine carbon dating. For all practical purposes, this can be categorically ruled out. For this to have a noticeable effect, the phenomenon would have to be much more pronounced: the proposed pattern is just slightly outside the error bars and only imposes a slight variation on top of the regular decay pattern. Archaeologists should not lose sleep over this. An unintended side effect was that the post attracted creationists. If you adhere to this belief, please don't waste your time commenting here. This is a site dedicated to physics, and off-topic comments will be treated like spam and deleted.
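A back-of-the-envelope check makes the carbon-dating point concrete (my numbers, not from the papers): even if the 14C decay constant were permanently off by 0.1%, the inferred age would shift by only about 0.1%, and an annual oscillation would largely average out over millennia anyway.

```python
import math

half_life_c14 = 5730.0                 # 14C half-life in years
lam = math.log(2) / half_life_c14
true_age = 10_000.0                    # hypothetical sample age in years
remaining = math.exp(-lam * true_age)  # surviving 14C fraction

lam_biased = lam * 1.001               # decay constant off by 0.1%
inferred_age = -math.log(remaining) / lam_biased
error_years = true_age - inferred_age  # about 10 years out of 10,000
```

A ten-year error on a ten-thousand-year date is well below the other uncertainties in radiocarbon dating.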
Another source of confusion was the difference between induced radioactive reactions and spontaneous decay. The latter is what we are supposed to see when measuring the decay of a radioactive isotope in the lab, and this is what these papers address. Induced transmutation is what can be observed when matter is, for instance, irradiated with neutrons. This process is pretty well understood and happens as a side effect within nuclear reactors (or even a nuclear warhead before the fission chain reaction overwhelms all other neutron absorption). The treatment of nuclear waste with a neutron flux is what I hinted at in the last sentence of the first paragraph. This emerging technology is very exciting and merits its own article, but it is an entirely different story. The news buried in the papers discussed here is that there may be a yet-unknown neutrino absorption reaction influencing decay rates that were believed to be governed solely by the half-life. Inverse beta decay is known to exist, but its reaction rate is much smaller than what would be required to explain the phenomenon these papers claim.
The spontaneous decay of a radioactive isotope is regarded as the gold standard for randomness in computer science, and there are some products that rely on this (h/t to Dennis Farr for picking up on this). That is, if the decay rate of a lump of radioactive material is no longer governed by the simple function N(t) = N0 · 2^(-t/t1/2), then the probability distribution that these random number generators rely on is no longer valid (the decay constant λ used in the distribution function at the link relates to the half-life via t1/2 = ln(2)/λ).
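The two equivalent forms of the decay law, and the half-life/decay-constant relation those distributions rely on, can be sketched quickly (the numbers here are illustrative):

```python
import numpy as np

t_half = 30.1                          # years, roughly the 137Cs half-life
lam = np.log(2) / t_half               # t_1/2 = ln(2) / lambda

t = np.linspace(0.0, 100.0, 101)
N0 = 1.0
via_half_life = N0 * 2.0 ** (-t / t_half)
via_lambda = N0 * np.exp(-lam * t)     # exactly the same curve

# Waiting times between individual decays follow an exponential distribution
# with mean 1/lambda -- this is the distribution a decay-based RNG assumes.
waits = np.random.default_rng(2).exponential(scale=1.0 / lam, size=100_000)
```

If λ itself drifted with the seasons, the waiting times would no longer be draws from a single fixed exponential distribution, which is the crux of the concern for decay-based random number generators.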
There were various thoughtful critical comments on the methodology and experimental set-up. The most prominent point was the contention that this was essentially the outcome of data-mining for patterns and then hand-picking the results that showed some discernible pattern. Ironically, this approach is exactly the kind of data processing that spawned a billion-dollar industry catering to the commercial Business Intelligence market. To me, this actually looks like a pretty smart approach to getting some more mileage out of old data series (assuming the authors didn't discard results diametrically opposed to their hypothesis). The downside is the lack of information on the care that went into collecting this data in the first place. For instance, it was repeatedly pointed out that the experimenters should have run a control to capture the background radiation, and needed to understand and control for the environmental impact on their measuring devices. Relying on third-party data also means relying on the reputation of the researchers who conducted the original experiments.
When the original claims were made, they triggered follow-up research. Some of it was inconclusive, some of it contradicted the findings, and a measurement performed on the Cassini probe's 238Pu radioisotope thermoelectric generator fuel clearly ruled out any Sun-distance-related influence on that alpha emitter.
Inevitably, with controversial results like this, the old adage that "extraordinary claims require extraordinary proof" is repeatedly dragged out.
I always thought this statement was cut off a bit short and should really read: "Extraordinary claims require extraordinary proof and merit extraordinary attention."
Because without the latter, sufficient proof may never be acquired even if it is out there. The experiments required to test this further are not expensive. An easy way to rule out seasonality is to perform these measurements closer to the equator, or to have them performed at the same time in a North American and a South American lab, as one Slashdot poster suggested.
Ultimately, a beta-emitter measurement on another space probe could lay this to rest and conclusively determine whether this is a real effect. It would be very exciting if it were confirmed, but it is certainly not settled at this point.