I will start by saying that I have a rule of thumb: Y2038 doesn't matter. You might think this rather short-sighted of me, but I will be retired by then. It's a deliberate choice on my part, made after dealing with time programming for a while.
It started before Y2K. The developers at the small company I worked for kept a friendly "demerit" board, just for fun. I even think I might have been the person to start it, but I was certainly the person who finished it, achieving 86,400 demerits in a single day. I had written a small time app, and on February 1st (yes, really February 1st, so it probably should have been 86,401 demerits) it goofed a leap-year calculation and was off by a whole day. I got one demerit per second.
In my current job I'm wading through some hairy time code, involving stock exchange delays, time zones and daylight saving time. The code is ancient C and I'm supposed to refactor it to C++ and make it elegant. It's full of time_t and struct tm, as well as a custom type that is a "weekly time": a time that measures how long since the week started. (It doesn't matter what year it is when figuring out how long to delay a stock ticker report.) It's really kind of fun, but it's prone to the 86,400-demerit kind of pitfalls I have fallen into before.
To wit: one should always use symbolic constants, such as minutesPerDay, and never numeric expressions, such as 24 * 60. Even an expression with as obvious a meaning as 24 hours times 60 minutes sometimes comes out as 20 * 64, which is a different enough number (1280 instead of 1440) to cause issues. Test-driven development found this one before it could earn me any demerits.