Recent puzzling observations of tiny variations in nuclear decay rates have led some to question the science behind carbon-14 dating and similar techniques.
However, scientists tested the hypothesis that solar radiation might affect the rate at which radioactive elements decay and found no detectable effect.
In their experiments, the maximum neutrino flux at the sample was several times greater than the flux of neutrinos from the sun.
The researchers followed the gamma-ray emission rate of each source for several weeks and found no difference between the decay rate of the spheres and the corresponding foils.
Many scientists, including Marie and Pierre Curie, Ernest Rutherford, and George de Hevesy, have attempted to influence the rate of radioactive decay by radically changing the pressure, temperature, magnetic field, acceleration, or radiation environment of the source.
No experiment to date has detected any change in rates of decay.
Data from laboratories in New York and Germany have also shown similarly tiny deviations over the course of a year.
Radiometric dating, for instance, will say that deeper levels of sediment are older than shallower levels of sediment.
It will give similar fossils similar ages, even when the fossils are widely separated.
Radiometric dating is very reliable in theory: the decay of radioactive materials is extremely predictable. But as with any other bit of experimental physics, "the difference between practice and theory is small in theory but large in practice." It's especially tricky for carbon-14 dating, which most recent material relies on.
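The predictability mentioned above comes from the exponential decay law, N(t) = N₀·e^(−λt), where the decay constant λ is fixed by the isotope's half-life (λ = ln 2 / t½). As a rough sketch, here is how an age estimate follows from the fraction of carbon-14 remaining in a sample; the function name and the half-life value (5,730 years, a commonly cited figure) are illustrative assumptions, not a lab-grade calibration:

```python
import math

# Commonly cited carbon-14 half-life in years (illustrative value).
HALF_LIFE_C14 = 5730.0

def c14_age(fraction_remaining: float) -> float:
    """Estimate a sample's age in years from the fraction of its
    original carbon-14 still present, using N(t) = N0 * exp(-lambda * t)
    with lambda = ln(2) / half-life."""
    decay_constant = math.log(2) / HALF_LIFE_C14
    return -math.log(fraction_remaining) / decay_constant

# A sample retaining half its original C-14 is one half-life old;
# a quarter remaining means two half-lives have elapsed.
print(round(c14_age(0.5)))   # 5730
print(round(c14_age(0.25)))  # 11460
```

Real carbon dating adds calibration curves to correct for historical variation in atmospheric C-14, which is one reason practice is messier than this idealized formula.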