
On 22 August 2016 at 18:38, Paul Eggert <eggert@cs.ucla.edu> wrote:
> Jon Skeet wrote:
>> Is that due to dates past 2038, or something else?
> Also dates before 1901 for 32-bit signed time_t, or before 1970 for
> unsigned time_t. I want the pre-1901 transitions to be checked, though,
> so I would rather stick with 64-bit signed time_t when generating the
> reference file.
>
> Plus, on some platforms zdump uses CRLF instead of LF to terminate
> output lines. There may be other niggling things like that.
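
(Spelling out those limits for anyone following along, since they fall straight out of the 32-bit arithmetic: here's a throwaway C sketch, nothing to do with zdump itself, assuming a 64-bit signed time_t and the Unix epoch, that just prints the bounding instants.)

    #include <inttypes.h>
    #include <stdio.h>
    #include <time.h>

    /* Throwaway sketch: print the instants that bound a 32-bit time_t.
       Assumes a 64-bit signed time_t and the Unix epoch, so the 32-bit
       extremes fit comfortably. */
    static void show(int64_t seconds)
    {
        time_t t = (time_t) seconds;
        struct tm *tm = gmtime(&t);
        char buf[64];
        if (tm == NULL) {
            printf("%" PRId64 " -> not representable here\n", seconds);
            return;
        }
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", tm);
        printf("%" PRId64 " -> %s\n", seconds, buf);
    }

    int main(void)
    {
        show(INT32_MIN);   /* 1901-12-13 20:45:52 UTC */
        show(INT32_MAX);   /* 2038-01-19 03:14:07 UTC */
        show(0);           /* 1970-01-01, the floor for unsigned time_t */
        return 0;
    }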
Right, on the line endings: my own format spec <https://github.com/nodatime/tzvalidate/blob/master/format.md> explicitly calls out U+000A, so that's consistent with using a Unix 64-bit version of zdump to generate the canonical file.
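
A generator can force that on any platform just by opening its output in binary mode, so the C runtime never substitutes CRLF for '\n'. A minimal sketch along those lines, with a placeholder file name and sample line rather than real tzvalidate output:

    #include <stdio.h>

    /* Minimal sketch: write lines terminated by U+000A only, even on
       platforms whose text mode would otherwise emit CRLF.  Opening the
       stream in binary mode ("wb") stops the runtime translating '\n'.
       The file name and the sample line are placeholders, not the real
       tzvalidate output. */
    int main(void)
    {
        FILE *out = fopen("tzvalidate.txt", "wb");
        if (out == NULL)
            return 1;
        fprintf(out, "Europe/London\n");   /* always a bare LF in binary mode */
        fclose(out);
        return 0;
    }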
>> I'd be testing something where time_t doesn't get involved at all, of
>> course - an entirely different, non-C-based representation. That's the
>> point of it, from my perspective.
> Something with bignums, say? (Because 64-bit signed time_t doesn't
> suffice for simulations of proton decay in degenerate stars. See:
> Adams FC. The future history of the universe. Cosmic Update. 2011-07-23.
> http://dx.doi.org/10.1007/978-1-4419-8294-0_3)
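
(For scale, with my own back-of-the-envelope numbers rather than anything from Paul: a quick check of how far a 64-bit signed time_t actually reaches.)

    #include <stdint.h>
    #include <stdio.h>

    /* Back-of-the-envelope scale check (my numbers, not from the thread):
       a 64-bit signed time_t spans about 2^63 seconds either side of the
       epoch, i.e. roughly 2.9e11 years, while proton-decay timescales are
       usually quoted at 1e34 years or more - hence the bignums. */
    int main(void)
    {
        const double seconds_per_year = 365.2425 * 24 * 60 * 60;
        double years = (double) INT64_MAX / seconds_per_year;
        printf("64-bit time_t reaches about %.3g years into the future\n", years);
        return 0;
    }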
If we still have time zones that need this sort of support in even a
thousand years' time, then... well, heck, *I* won't be maintaining it :)

Given the other reactions around file merging, perhaps the data file
should just be hosted as a separate file?

Jon