Some computer standards are defined in terms of Greenwich Mean Time (GMT), which is equivalent to Universal Time (UT). GMT is the "civil" name for the standard; UT is the "scientific" name for the same standard. Two-digit years are interpreted relative to the current year: for example, if the current year is 1999, then years in the range 19 to 99 are assumed to mean 1919 to 1999, while years from 0 to 18 are assumed to mean 2000 to 2018. After the year number has been adjusted in this way, 1900 is subtracted from it.
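The windowing rule described above can be sketched as follows. This is an illustrative reconstruction inferred from the 1999 example, not any particular standard's API; the function names and the 80-years-back pivot are assumptions.

```python
def expand_year(yy, current_year=1999):
    """Map a two-digit year to a full year using a sliding window.

    With current_year=1999, values 19..99 map to 1919..1999 and
    values 0..18 map to 2000..2018, matching the example in the text.
    """
    century = current_year - current_year % 100
    candidate = century + yy
    # Anything more than 80 years in the past is assumed to belong
    # to the next century (illustrative pivot choice).
    if candidate < current_year - 80:
        candidate += 100
    return candidate

def adjusted_year(yy, current_year=1999):
    """Apply the windowing, then subtract 1900 as the text describes."""
    return expand_year(yy, current_year) - 1900
```

For instance, `expand_year(18)` yields 2018 while `expand_year(19)` yields 1919, and `adjusted_year(0)` yields 100 (the year 2000 expressed as years since 1900).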
In UTC, however, about once every year or two there is an extra second, called a "leap second." The leap second is always added as the last second of the day, and always on December 31 or June 30.
The distinction between UTC and UT is that UTC is based on an atomic clock and UT is based on astronomical observations, which for all practical purposes is an invisibly fine hair to split.
Because the earth's rotation is not uniform (it slows down and speeds up in complicated ways), UT does not always flow uniformly.