Dating schemes based on rates of radioactivity have been refined and scrutinized for several decades.
American physical chemist Willard Libby led a team of scientists in the post-World War II era to develop a method that measures radiocarbon activity. Radiocarbon dating is only one of many methods of radiometric dating; most rock samples, for instance, are not dated using carbon at all but, where datable (not every rock is), by potassium-argon, uranium-lead, rubidium-strontium, or fission-track dating. In fact, quite a number of factors can throw off the accuracy of a given carbon-14 date. Libby's updated version of the "Curve of Knowns," with added data points, still showed a close fit, though not every data point fell exactly on the curve. Radiometric dating is also self-checking, because the data (after certain preliminary calculations are made) are fitted to a straight line (an "isochron") by means of standard linear-regression methods of statistics.
The slope of the line determines the date, and the closeness of fit is a measure of the statistical reliability of the resulting date.
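The isochron procedure described above can be sketched in a few lines of code. The sample data and the use of the rubidium-strontium system here are illustrative assumptions, not values from the text; the decay constant for ⁸⁷Rb is the commonly published figure:

```python
import math
import numpy as np

# Decay constant for 87Rb in per-year units (commonly published value);
# used here purely for illustration of the isochron method.
LAMBDA_RB87 = 1.42e-11

def isochron_age(ratio_parent, ratio_daughter):
    """Fit a straight line (the isochron) through the measured isotope
    ratios; the slope gives the age via t = ln(slope + 1) / lambda,
    and the correlation coefficient indicates the closeness of fit."""
    slope, intercept = np.polyfit(ratio_parent, ratio_daughter, 1)
    r = np.corrcoef(ratio_parent, ratio_daughter)[0, 1]
    age_years = math.log(slope + 1.0) / LAMBDA_RB87
    return age_years, intercept, r

# Synthetic minerals from a rock that closed 1 billion years ago:
t_true = 1.0e9
true_slope = math.exp(LAMBDA_RB87 * t_true) - 1.0
parent = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # 87Rb/86Sr per mineral
daughter = 0.705 + true_slope * parent          # 87Sr/86Sr, initial 0.705
age, initial, r = isochron_age(parent, daughter)
print(f"age = {age / 1e9:.2f} Gyr, initial ratio = {initial:.3f}, r = {r:.4f}")
```

Because the synthetic points lie exactly on a line, the fit recovers the assumed age and initial ratio; with real measurements, scatter about the line is what the correlation coefficient quantifies.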
With an initially large margin of error, and with anything that did not square with expectations judged as contaminated, the method appeared to work and was hailed as completely reliable, just as the atomic clock is reliable, and this nobody doubted.
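The activity measurement at the heart of Libby's method reduces to simple exponential decay. A minimal sketch, assuming the conventional 5,730-year half-life and the roughly 13.56 disintegrations per minute per gram of carbon measured in living tissue (both standard published figures, not values from the text):

```python
import math

HALF_LIFE_C14 = 5730.0    # years (conventional 14C half-life)
MODERN_ACTIVITY = 13.56   # disintegrations/min per gram of carbon in living tissue

def radiocarbon_age(measured_activity):
    """Age from exponential decay: t = (t_half / ln 2) * ln(A0 / A)."""
    return (HALF_LIFE_C14 / math.log(2)) * math.log(MODERN_ACTIVITY / measured_activity)

# A sample retaining half the modern activity dates to one half-life:
print(round(radiocarbon_age(MODERN_ACTIVITY / 2)))  # prints 5730
```

In practice the measured activity must first be corrected for the contaminating factors mentioned above, which is why raw activity alone cannot settle a date.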