One proposed mechanism for accelerated decay is a variable speed of light.
While astronomers have found that magnetars emit radiation that could cause bouts of accelerated decay, and that such bouts may be more common than originally thought, the heat produced by this radiation over so short a period presents a problem for creationists.
A more common approach is to allow for accelerated nuclear decay during the early portion of terrestrial history, when the naturally decaying elements were buried far below the crust (or, in some models, far below the waters of the global flood), thereby addressing the heat problem.
As any first-year student of algebra soon learns, a single equation in two unknown variables cannot be solved uniquely.
In fact, the above formula is far too simple, because it assumes that the amount of daughter isotope was zero at the start.
The formula below is a proper model that admits the possibility that some daughter isotope was present when the rock formed:

A = D + P(e^(λt) − 1)

where D is the amount of daughter isotope present at the start, P is the amount of parent isotope present today, A is the amount of daughter isotope present today, λ is the decay constant, and t is the age of the rock.
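The two-unknowns point can be sketched numerically. In the sketch below all numbers are invented for illustration (a hypothetical decay constant, sample amounts, and a "true" age); it shows that the same measured amounts yield different computed ages depending on what initial daughter amount D is assumed:

```python
import math

def age(parent_now, daughter_now, daughter_initial, decay_constant):
    """Solve A = D + P*(e^(lam*t) - 1) for t, given an ASSUMED initial daughter D."""
    return math.log(1.0 + (daughter_now - daughter_initial) / parent_now) / decay_constant

# Hypothetical numbers: a decay constant for a 1-billion-year half-life.
lam = math.log(2) / 1.0e9

# Forward-model a rock that is "really" 2.0e9 years old and formed with D = 10 units:
P, D, t_true = 100.0, 10.0, 2.0e9
A = D + P * (math.exp(lam * t_true) - 1)

print(age(P, A, D, lam))    # assuming the correct D recovers ~2.0e9 years
print(age(P, A, 0.0, lam))  # assuming D = 0 gives a different (wrong) age
```

The point of the sketch: the measurable quantities (P and A) are the same in both calls, but the computed age depends on the unmeasurable assumption about D.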
In order to simplify the formula, scientists generally assume that igneous rock contains no argon when it forms, because argon, being a noble gas, would escape from the cooling lava. However, fresh volcanic rock is routinely found to contain argon when it first cools.
The proportion of argon to radioactive potassium in the sample today is observable, and the decay constant of potassium is readily calculable by measuring the amount of argon produced from the decay of potassium-40 over a specified time.
But neither the age of the rock nor the original proportion of argon to radioactive potassium in the sample is observable.
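Under the zero-initial-argon assumption, the conventional potassium-argon age calculation can be sketched as follows. The constants used are standard published values (potassium-40's half-life of about 1.25 billion years, with roughly 10.7% of decays yielding argon-40 and the rest calcium-40); the sample ratio is invented for illustration:

```python
import math

# Standard published values (not taken from this article):
HALF_LIFE_K40 = 1.25e9                  # years, approximate
LAMBDA = math.log(2) / HALF_LIFE_K40    # total decay constant of 40K, per year
AR_BRANCH = 0.107                       # fraction of 40K decays that yield 40Ar

def k_ar_age(ar40_over_k40):
    """Conventional K-Ar age equation, assuming zero argon at formation."""
    return math.log(1.0 + ar40_over_k40 / AR_BRANCH) / LAMBDA

# Hypothetical measured ratio of radiogenic 40Ar to remaining 40K:
print(k_ar_age(0.0107))   # roughly 1.7e8 years
```

Note that the whole calculation hinges on the assumption questioned above: if the rock contained argon at formation, `ar40_over_k40` overstates the radiogenic argon and the computed age is too old.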
But new research by creationists has revealed a large number of problems with radiometric dating.
In some cases, such as carbon-14 dating, radiometric dating actually gives strong evidence for a young Earth.
Radiometric dating is based on the decay rate of radioactive isotopes into stable, nonradioactive isotopes.