Abstract:
The uranium and thorium decay series (hereafter "U-series") include the nuclides of ten elements, all of which can be found at trace levels in rocks and minerals. The relatively short half-lives of the U-series nuclides give them considerable potential to decipher a wide variety of natural processes. The common observation of secular radioactive disequilibrium between parent and daughter nuclides provides a time dimension that is not possible with the more commonly used trace elements. However, just like conventional trace elements, the behavior of the U-series elements depends on their partitioning between coexisting phases, such as minerals and melts. Interpreting radioactive disequilibrium in the U-series therefore critically requires an understanding of how parent and daughter nuclides of these elements are fractionated from one another under the conditions of interest. Without appropriate partition coefficients (D) it is difficult to separate the part of any disequilibrium signal that is due to process from the part that is due to time. This problem is minimized, but by no means eliminated, by the use of activity ratios rather than the concentration ratios conventionally used for trace elements. Even so, there is little point in determining isotopic concentrations at the sub-femtogram level if the data themselves cannot be interpreted or modeled with comparable precision and accuracy. Unfortunately, with the partial exception of U, Th and Pb, our knowledge of the partitioning of the U-series elements lags well behind our ability to measure them. The problem is exacerbated by the fact that nearly all of the elements of interest are highly incompatible (D « 1) in all common silicate and oxide minerals, and many lack stable or long-lived isotopes, so there are serious technical difficulties associated with determining their partition coefficients experimentally.
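The distinction between activity ratios and concentration ratios can be made concrete with a short numerical sketch. The relation A = λN (with λ = ln 2 / t½) is standard radioactive-decay physics; the half-life and molar-mass values below are approximate, and the function names are my own, chosen for illustration only:

```python
import math

N_A = 6.02214076e23  # Avogadro's number, atoms/mol

# Approximate half-lives (years) and molar masses (g/mol) for illustration
HALF_LIFE = {"238U": 4.468e9, "230Th": 7.54e4}
MOLAR_MASS = {"238U": 238.05, "230Th": 230.03}

def activity(mass_g, nuclide):
    """Activity (decays per year) of mass_g grams of a pure nuclide: A = lambda * N."""
    lam = math.log(2) / HALF_LIFE[nuclide]        # decay constant, 1/yr
    n_atoms = mass_g / MOLAR_MASS[nuclide] * N_A  # number of atoms
    return lam * n_atoms

def activity_ratio(mass_d, daughter, mass_p, parent):
    """(daughter/parent) activity ratio; 1.0 corresponds to secular equilibrium."""
    return activity(mass_d, daughter) / activity(mass_p, parent)

# At secular equilibrium the daughter/parent *concentration* ratio is tiny
# (proportional to the ratio of half-life times molar mass), yet the
# *activity* ratio is exactly 1 -- which is why activity ratios are the
# natural currency for disequilibrium work.
m_U = 1.0e-6  # 1 microgram of 238U
m_Th_eq = m_U * (HALF_LIFE["230Th"] * MOLAR_MASS["230Th"]) / (
    HALF_LIFE["238U"] * MOLAR_MASS["238U"]
)
print(round(activity_ratio(m_Th_eq, "230Th", m_U, "238U"), 6))  # → 1.0
```

Any melting or crystallization process that fractionates Th from U (different D values in the coexisting phases) drives this ratio away from 1, and the return toward 1 is governed purely by decay, which is the time information the abstract refers to.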