It is assumed that the ratio has been constant for a very long time before the industrial revolution.
Is this assumption correct (for on it hangs the whole validity of the system)?
Back in the 1940s, the American chemist Willard Libby used this fact to determine the ages of organisms long dead.
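The arithmetic behind the method is simple, and it makes clear exactly where the constant-ratio assumption enters. The following sketch uses the standard decay law with the commonly cited 5730-year half-life (an assumed parameter; the text does not state which half-life value is in use):

```python
import math

# Assumed half-life of carbon-14 in years (the modern "Cambridge" value).
HALF_LIFE_YEARS = 5730.0

def radiocarbon_age(fraction_remaining: float) -> float:
    """Age in years, from the fraction of the original 14C still present.

    This assumes the atmospheric 14C ratio at the organism's death was
    the same as today's -- precisely the assumption under discussion.
    A wrong starting ratio shifts every computed age.
    """
    return -HALF_LIFE_YEARS / math.log(2) * math.log(fraction_remaining)

# A sample retaining half its radiocarbon dates to one half-life,
# a quarter to two half-lives:
print(round(radiocarbon_age(0.5)))   # 5730
print(round(radiocarbon_age(0.25)))  # 11460
```

Note that the formula takes only the measured fraction as input; the assumed initial ratio is hidden in how that fraction is computed from the raw measurement, which is why calibration against tree rings matters.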
(Ham et al., page 68.) The claim that we cannot know the 14C ratio in the past, or that this is "the technique's Achilles' heel", is incorrect.
The whole validity of radiocarbon dating for the past 10,000 years---the time span of interest to biblical chronology---hangs only on the tree-ring chronologies which are used to calibrate it. This process does not involve any assumption about historic radiocarbon to stable carbon ratios, because the radiocarbon concentration in the tree-ring samples would be affected in exactly the same way as the radiocarbon concentration in the specimen to be dated.

To quote again from The Answers Book: Some recent, though controversial, research has raised the interesting suggestion that c (the speed of light) has decreased in historical times. If it is correct, then radioactive decay rates would automatically be affected, and would show artificially high ages.
But as the method was refined, it started to show rather regular anomalies.
First, it was noticed that wood grown in the 20th century, when radiocarbon dated, appears older than wood grown in the 19th century.
Professor Willard Libby produced the first radiocarbon dates in 1949 and was later awarded the Nobel Prize for his efforts.