an error in the coding of certain computer programs in which the year portion of dates was represented by only two decimal digits, the first two digits being assumed to be "19". In such a program the year 1975 is represented as "75". This was a common practice in computer programming even into the 1990s, as many programmers failed to consider that their programs would still be in use after the year 1999. With such a program, a person born in 2000 (stored as "00" and therefore read as 1900) could be recorded as 101 years old in 2001; such an error could cause serious problems as varied as the programs themselves.
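A minimal sketch (not part of the original definition) of how such an error arises, assuming ages are computed by subtracting two-digit years with a fixed "19" prefix:

```python
def two_digit_age(birth_year_2d: int, current_year_2d: int) -> int:
    """Compute an age from two-digit years, assuming the century is always '19'."""
    birth_full = 1900 + birth_year_2d      # faulty assumption: every year is 19xx
    current_full = 1900 + current_year_2d
    return current_full - birth_full

# A person born in 1975, checked in 1998: works as expected.
print(two_digit_age(75, 98))   # 23

# A person born in 2000, stored as "00" and read as 1900, checked against
# the true year 2001: the program reports an age of 101 instead of 1.
print(2001 - (1900 + 0))       # 101

# If both years are stored as two digits, a person born in 1975 appears
# to be -75 years old in the year 2000.
print(two_digit_age(75, 0))    # -75
```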
"In 1998 many programs with the
academic year, anomalistic year, astronomical year, balance-of-payments problem, bissextile year, calendar year, christian year, church year, civil year, common year, each year, equinoctial year, every year, financial year, fiscal year, great year, health problem, holy year, homework problem, intercalary year, jewish new year, leap year, light year, lunar year, mohammedan year, new year, off year, per year, platonic year, problem, problem solver, problem solving, race problem, sabbatical year, school year, sidereal year, skin problem, solar year, three year old, time of year, tropical year, two year old, year, year 2000 bug, year 2000 compliant, year dot, year of grace