re: the UNIX 2038 problem...
from Wikipedia's "Year 2038 problem" article....
Alternative proposals have been made (some of which are already in use), such as storing either milliseconds or microseconds since an epoch (typically either 1 January 1970 or 1 January 2000) in a signed 64-bit integer, providing a minimum range of 300,000 years at microsecond resolution.[18][19] In particular, Java's use of 64-bit long integers everywhere to represent time as "milliseconds since 1 January 1970" will work correctly for the next 292 million years.
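FWIW, the arithmetic checks out. Here's a quick back-of-the-envelope sketch (plain Java, no particular library's API, class name is just made up) that derives both figures from Long.MAX_VALUE:

    // Rough range check for a signed 64-bit tick count since an epoch.
    public class EpochRange {
        public static void main(String[] args) {
            // Java's System.currentTimeMillis() returns a signed 64-bit long:
            // milliseconds since 1 January 1970 (the Unix epoch).
            long maxTicks = Long.MAX_VALUE;              // 2^63 - 1
            double secondsPerYear = 365.25 * 24 * 3600;  // ~31.6 million

            double yearsAtMillis = maxTicks / 1_000.0 / secondsPerYear;
            double yearsAtMicros = maxTicks / 1_000_000.0 / secondsPerYear;

            // Prints roughly 292 million years (milliseconds) and
            // roughly 292 thousand years (microseconds), matching the
            // figures quoted above.
            System.out.printf("milliseconds: ~%.0f million years%n", yearsAtMillis / 1e6);
            System.out.printf("microseconds: ~%.0f thousand years%n", yearsAtMicros / 1e3);
        }
    }

So the "300,000 years" quoted above is just the microsecond case rounded up, and the 292-million-year figure is the same math at millisecond resolution.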
Does anyone REALLY care about 300,000 years broken down into MICROSECONDS??