First, age determinations. Radiocarbon dating is based on the observation that the 14C in previously living organisms decays at a constant, predictable rate: half of a sample's 14C decays away in about 5730 years, in exponential fashion. This means that after 5730 years, 1/2 of a previously living organism's 14C remains, 1/4 remains after 11,460 years, and so on. The uncertainty or error range reported with all radiocarbon dates reflects imprecision in counting the radioactive decay of carbon atoms in a sample, and it is a critical component of the date. The error range indicates a roughly 68% chance that the age of the sample falls within the interval it brackets; double the error range, and the resulting interval is about 95% likely to include it. What is important to note is that raw radiocarbon age ranges are centered on the date that is statistically most likely to be correct for a dated sample. In that sense, it is somewhat warranted to use that central date as shorthand for how old a sample is. That is, for a bone point dated to 13,400 +/- 100 BP (these dates are expressed as years before present, with the present defined as AD 1950), it is technically OK to call it a 13,400-year-old point, since that age is the most likely to be correct within the interval defined by the error range.
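The arithmetic above is simple enough to sketch in a few lines of Python. This is just the exponential-decay relationship and the one- and two-sigma intervals for the hypothetical bone point date; note that labs conventionally compute ages with the older Libby half-life (5568 years) rather than the 5730-year value used here for illustration.

```python
import math

HALF_LIFE = 5730.0  # 14C half-life in years (labs conventionally use 5568)

def fraction_remaining(years: float) -> float:
    """Fraction of the original 14C left after a given number of years."""
    return 0.5 ** (years / HALF_LIFE)

# After one half-life, half remains; after two, a quarter.
print(fraction_remaining(5730))   # 0.5
print(fraction_remaining(11460))  # 0.25

# The bone point example: 13,400 +/- 100 BP.
age, err = 13400, 100
print((age - err, age + err))          # (13300, 13500): ~68% chance
print((age - 2 * err, age + 2 * err))  # (13200, 13600): ~95% chance
```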
The problem, however, is that radiocarbon years don't correspond to calendar years and usually underestimate the true age of any given sample. This is because the concentration of atmospheric radiocarbon has not been constant over time. This problem can be corrected, however, through the use of calibration curves, which are built by radiocarbon dating samples of known calendar age and modeling the discrepancy between the two ages. Samples whose calendar age can be determined independently include historical artifacts, as well as organic remains that grow or accumulate in yearly increments, such as trees (which add a new growth ring every year) or corals.
Until recently, reliable calibration curves only stretched back to ca. 24,000 years BP, but recent developments have extended their range past 40,000 years BP, although some debate remains about the finer details of these more extensive curves. Regardless, from the perspective of paleoanthropology, and especially of research focused on the timing of the disappearance of the Neanderthals, this has been a real boon, since it finally allows researchers to discuss that process within the same chronological framework as the rest of recent human prehistory. For the transition interval (i.e., the period 30-40 ky BP), the discrepancy between radiocarbon and calendar ages has been argued to be on the order of 5,000 years, meaning that a radiocarbon date of, say, 35,000 BP translates into a calendar (or calibrated) age of about 40,000 BP.
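Mechanically, calibration amounts to looking a radiocarbon age up against paired (radiocarbon, calendar) measurements. The sketch below illustrates the idea with linear interpolation over a handful of hypothetical anchor points loosely based on the ~5,000-year offset mentioned above; real curves such as IntCal contain thousands of points and far more structure.

```python
import bisect

# Hypothetical (radiocarbon BP, calendar BP) anchor points, for illustration
# only; they roughly reflect the ~5,000-year offset in the 30-40 ky interval.
CURVE = [(25000, 29000), (30000, 34500), (35000, 40000), (40000, 44000)]

def calibrate(rc_age: float) -> float:
    """Linearly interpolate a calendar age from a radiocarbon age."""
    xs = [rc for rc, _ in CURVE]
    i = bisect.bisect_left(xs, rc_age)
    if i == 0:                 # below the curve: clamp to the first anchor
        return CURVE[0][1]
    if i == len(CURVE):        # above the curve: clamp to the last anchor
        return CURVE[-1][1]
    (x0, y0), (x1, y1) = CURVE[i - 1], CURVE[i]
    return y0 + (y1 - y0) * (rc_age - x0) / (x1 - x0)

print(calibrate(35000))  # 40000.0 under these toy anchors
```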
This precision is welcome, but in my view it has been abused somewhat, on two levels. First, people wanting to emphasize how old a given object or associated assemblage is now systematically use calibrated ages. The reverse, however, is not true: the ages of unexpectedly recent finds, such as the late-lasting Mousterian assemblages of Gibraltar, routinely continue to be presented as radiocarbon ages (i.e., ca. 28,000 BP, as opposed to, say, 32-33,000 cal. BP). I suppose this practice makes sense from a PR perspective, but it certainly muddles arguments about prehistoric chronology, especially for that (large) segment of the public that doesn't understand the subtleties of radiocarbon dating. It also creates a gap between earlier research that published radiocarbon ages and current papers that use calibrated ages or, worse, a combination of both.
The second problem is that most researchers and science journalists continue to present and discuss calibrated ages in terms of their central tendencies, when this is absolutely unwarranted. This is because calibration curves incorporate the irregularities in atmospheric radiocarbon concentration at various times in the past. As a result, the smooth, bell-shaped probability curve centered on a given age that is obtained by radiocarbon dating turns, once calibrated, into a curve often best described as a hair-raising rollercoaster. Here's an example from the OxCal web site to illustrate what I mean:
This plot shows how the radiocarbon measurement 3000+-30BP would be calibrated. The left-hand axis shows radiocarbon concentration expressed in years `before present' and the bottom axis shows calendar years (derived from the tree ring data). The pair of blue curves show the radiocarbon measurements on the tree rings (plus and minus one standard deviation) and the red curve on the left indicates the radiocarbon concentration in the sample. The grey histogram shows possible ages for the sample (the higher the histogram the more likely that age is).
This means that while the calibrated range is derived from that of the original radiocarbon date, the mid-point of that range is not necessarily the most likely age within it. Unlike with raw radiocarbon dates, then, it is often unwarranted to use the midpoint of a calibrated age range as shorthand for the most likely calibrated age of a sample.
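A toy example makes the midpoint-versus-mode distinction concrete. The probabilities below are invented, not from any real calibration, but they mimic the multi-peaked distributions that wiggles and plateaus in the curve often produce:

```python
# Invented calibrated-age probabilities (calendar BP -> probability),
# deliberately bimodal, as real calibration plateaus often are.
probs = {
    3150: 0.05, 3160: 0.30, 3170: 0.10, 3180: 0.05,
    3190: 0.05, 3200: 0.10, 3210: 0.25, 3220: 0.10,
}
ages = sorted(probs)
midpoint = (ages[0] + ages[-1]) / 2  # center of the calibrated range
mode = max(probs, key=probs.get)     # single most probable age
print(midpoint)  # 3185.0
print(mode)      # 3160
```

Here the midpoint of the range (3185 BP) sits in a probability trough, while the most likely age (3160 BP) lies well off-center; quoting the midpoint alone would misrepresent the date.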
I've simplified this discussion somewhat for the sake of clarity (and kept it reference-free for the same reason), but it should now be clear why it really grinds my gears to see calibrated ages tossed around uncritically in the literature and, especially, in news reports discussing recent research that has an important chronometric dimension to it.