Find the word definition

The Collaborative International Dictionary
year 2000 problem

millennium bug \mil*len"ni*um bug`\ (m[i^]l*l[e^]n"n[i^]*[u^]m b[u^]g`), n. (Computers) An error in the coding of certain computer programs which store the year component of the date as two digits, assuming that the first two digits are 19, rather than as a complete number of four digits; when such programs are used after January 1, 2000, the date may be misinterpreted, causing serious errors or total failure of the program; -- called also year 2000 bug, year 2000 problem and Y2K bug.

Note: In the several years leading up to the year 2000, large corporations and other users of computers in total spent many billions of dollars correcting this error in the programs they use.

year 2000 problem

year 2000 bug \year 2000 bug\, year 2000 problem \year 2000 problem\, n. (Computers) an error in the coding of certain computer programs in which the year portion of dates was represented by only two decimal digits, assuming that the first two digits are ``19''. In such a program the year 1975 is represented as ``75''. This was a common practice in computer programming even into the 1990s, as many programmers failed to consider that their programs would be used after the year 1999. Thus, with such a program, a person born in 2000 would be considered 101 years old in 2001; many different serious problems, as varied as the programs themselves, could be caused by such an error.
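The age error described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not code from any real system: the function name and the assumption that the stored birth year is prefixed with "19" while the current year is handled correctly are both hypothetical.

```python
def buggy_age(current_year: int, birth_yy: int) -> int:
    """Compute an age as a program with the year 2000 bug might:
    the two-digit stored birth year is always assumed to be 19xx."""
    return current_year - (1900 + birth_yy)

# A person born in 2000 is stored as birth year "00", read back as 1900:
print(buggy_age(2001, 0))   # 101 -- the error described in the definition
print(buggy_age(1999, 75))  # 24  -- correct, as long as both dates are 19xx
```

The bug stays invisible while every date involved falls in the twentieth century, which is why it survived so long.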

Note: In 1998 many programs with the year 2000 bug were still not corrected, and it is not clear how many programs will retain the bug when the year 2000 arrives. Tune in then.

Syn: millennium bug, Y2K bug, Y2K problem. [PJC]

Wiktionary
year 2000 problem

n. the millennium bug

Wikipedia
Year 2000 problem

The Year 2000 problem is also known as the Y2K problem, the Millennium bug, the Y2K bug, or Y2K. Problems arose because programmers had represented the four-digit year with only its final two digits, making the year 2000 indistinguishable from 1900. The assumption that a twentieth-century date was always intended caused various errors, such as the incorrect display of dates and the inaccurate ordering of automated dated records or real-time events.

In 1997, the British Standards Institution (BSI) developed a standard, DISC PD2000-1, which defines "Year 2000 Conformity requirements" as four rules:

1. No valid date will cause any interruption in operations.
2. Calculation of durations between, or the sequence of, pairs of dates will be correct whether or not the dates lie in different centuries.
3. In all interfaces and in all storage, the century must be unambiguous: either specified, or calculable by algorithm.
4. Year 2000 must be recognized as a leap year.

It identifies two problems that may exist in many computer programs.
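One common way to make the century "calculable by algorithm", as the third conformity rule requires, is date windowing: two-digit years on one side of a pivot are assigned to one century, the rest to the other. The sketch below is illustrative only; the pivot value 50 is an assumption here, not part of the BSI standard.

```python
PIVOT = 50  # illustrative pivot: 50-99 -> 19xx, 00-49 -> 20xx

def expand_year(yy: int) -> int:
    """Map a two-digit year to a four-digit year via a fixed window."""
    return 1900 + yy if yy >= PIVOT else 2000 + yy

print(expand_year(99))  # 1999
print(expand_year(0))   # 2000
```

Windowing was a popular remediation because it avoided converting stored data, at the cost of merely postponing the ambiguity by several decades.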

Firstly, the practice of representing the year with two digits became problematic with logical errors arising upon "rollover" from x99 to x00. This caused some date-related processing to operate incorrectly for dates and times on and after 1 January 2000, and on other critical dates billed as "event horizons". Without corrective action, long-working systems would break down when the "... 97, 98, 99, 00 ..." ascending numbering assumption suddenly became invalid.
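The rollover failure is easy to demonstrate with record ordering. Assuming, for illustration, dates stored as two-digit-year "YYMMDD" strings (a common storage convention of the era, though the exact format varied by system), lexicographic sorting works within the 1900s and then breaks at the century boundary:

```python
# Dates stored as "YYMMDD" strings with two-digit years.
dates = ["971104", "980615", "991231", "000101"]  # last entry is 1 Jan 2000

# Lexicographic sort treats "00" as earlier than "97":
print(sorted(dates))
# ['000101', '971104', '980615', '991231'] -- 2000 sorts before 1997
```

This is the "ascending numbering assumption" failing: every comparison, sort, and duration calculation built on two-digit years inherits the same fault.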

Secondly, some programmers had misunderstood the Gregorian rule for years exactly divisible by 100, and assumed the year 2000 would not be a leap year. Years divisible by 100 are not leap years, except those also divisible by 400; thus the year 2000 was a leap year.
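The full Gregorian rule and the misunderstanding described above can be contrasted directly. The function names here are illustrative:

```python
def is_leap(year: int) -> bool:
    """Full Gregorian rule: divisible by 4, except century years,
    unless the century year is also divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def is_leap_buggy(year: int) -> bool:
    """The misunderstanding: century years are never leap years."""
    return year % 4 == 0 and year % 100 != 0

print(is_leap(2000))        # True  -- 2000 is divisible by 400
print(is_leap_buggy(2000))  # False -- the buggy rule misses 29 Feb 2000
print(is_leap(1900))        # False -- both rules agree on 1900
```

Systems with this second bug failed not on 1 January 2000 but on 29 February 2000, a date they believed did not exist.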

Companies and organizations worldwide checked, fixed, and upgraded their computer systems.

The number of computer failures that occurred when the clocks rolled over into 2000, in spite of the remedial work, is not known; among other reasons, organisations were reluctant to report problems.