Looks like zdump claims Dublin has DST during winter, not summer:

$ ./zdump -i -c2020,2021 Europe/Dublin
TZ="Europe/Dublin"
-          -   +00 GMT 1
2020-03-29 02  +01 IST
2020-10-25 01  +00 GMT 1

Compare with the output for London, which is as I would expect:

$ ./zdump -i -c2020,2021 Europe/London
TZ="Europe/London"
-          -   +00 GMT
2020-03-29 02  +01 BST 1
2020-10-25 01  +00 GMT

Generally, I think it's neater if the time stretches with the larger GMT values are considered DST. This is almost always true, except for Europe/Dublin, Eire, Africa/Casablanca and Africa/El_Aaiun. Morocco (and Western Sahara) change the time to have shorter fasts during Ramadan, so to speak, to *not* save daylight. I would therefore still argue that the time outside Ramadan should be considered DST and the time during Ramadan the non-DST time.

I am aware that the view "DST means larger GMT value" is not a precise definition (a region that changes time zones in quick succession might easily accrue more than two GMT values within a few years). It's therefore not a great candidate for implementation in zic/zdump. However, I hold that periodic *future* annual back-and-forth switches should adopt that principle.

Anyway, over to you

Stefan

PS: You are providing an *amazing* service - many thanks for that!
On 12/7/19 9:21 AM, Stefan Rueger wrote:
I think it's neater if the time stretches with the larger GMT values are considered DST.
It is a controversial area because there are two competing objectives: having standard time be the standard UT offset ('std' in POSIX) vs having DST's offset be greater than standard time's. As noted in the 'europe' file, in a comment from Joseph S. Myers (2005-01-26):

	"(Note that the time in the Republic of Ireland since 1968 has been defined in terms of standard time being GMT+1 with a period of winter time when it is GMT, rather than standard time being GMT with a period of summer time being GMT+1.)"

and we eventually went with that interpretation, as opposed to the one you suggested. Morocco is similar, as are some other historic uses of "winter time" as DST. (Whether this practice "saves" daylight depends on whether one prefers morning to evening daylight. :-)

If you prefer the other interpretation, you can get it with 'make rearguard_tarballs'.
On Dec 10, 2019, at 3:02 PM, Paul Eggert <eggert@cs.ucla.edu> wrote:
On 12/7/19 9:21 AM, Stefan Rueger wrote:
I think it's neater if the time stretches with the larger GMT values are considered DST.
It is a controversial area because there are two competing objectives: having standard time be the standard UT offset ('std' in POSIX) vs having DST's offset be greater than standard time's. As noted in the 'europe' file, in a comment from Joseph S. Myers (2005-01-26):
"(Note that the time in the Republic of Ireland since 1968 has been defined in terms of standard time being GMT+1 with a period of winter time when it is GMT, rather than standard time being GMT with a period of summer time being GMT+1.)"
and we eventually went with that interpretation, as opposed to the one you suggested.
I.e., it *is* DST, except that the "S" stands for "spending" rather than "saving". :-)

Either that, or "S" stands for "shifting", in both the case of shifting the clock forward from standard time in the spring and the case of shifting the clock *backward* from standard time in the autumn. (That also has the advantage of not claiming anything's "saved".)

C18 says that

	Some functions deal with local time, which is the calendar time expressed for some specific time zone, and with Daylight Saving Time, which is a temporary change in the algorithm for determining local time. The local time zone and Daylight Saving Time are implementation-defined.

and that

	The value of tm_isdst is positive if Daylight Saving Time is in effect, zero if Daylight Saving Time is not in effect, and negative if the information is not available.

which leaves the interpretation to the implementation, as it doesn't say whether the time when the temporary change is *not* in effect must be the legally-defined "standard time" or not, nor whether the "temporary change" must set the clock forward.

POSIX appears not to disallow a TZ setting in which the "dst" offset is less than the "std" offset, unless I'm missing something, so, again, I'm not sure any standard forbids setting the clock *backwards* for "Daylight Saving Time".
On 2019-12-10 23:43, Guy Harris wrote:
POSIX appears not to disallow a TZ setting in which the "dst" offset is less than the "std" offset, unless I'm missing something, so, again, I'm not sure any standard forbids setting the clock *backwards* for "Daylight Saving Time".
I see this more as a problem of upward compatibility of tzdb, not of POSIX: for more than twenty years, the tzdb description in newctime.3 has guaranteed (in addition to what C and POSIX require) that "Tm_isdst is non-zero if summer time is in effect" and probably the "if" was even meant as "if and only if". This is no longer true since 2018c, but the interface description in newctime.3 was changed only in 2018f (without any announcement in NEWS).

Some programs had relied on the old interface description, and have since adapted their functionality. For others, I do not know. For instance, the PHP date_format() function has a format specifier "I" for an indication of winter and summer time. Have the semantics of that format specifier changed? Or has it been removed?

Due to the importance of tzdb, and its widespread use, even tiny changes in its external interfaces may cause compatibility problems. Using "rearguard" tzdb data can help in transitions.

Michael Deckers.
On 12/11/19 10:12 AM, Michael H Deckers wrote:
the tzdb description in newctime.3 ... guaranteed (in addition to what C and POSIX require) that "Tm_isdst is non-zero if summer time is in effect" and probably the "if" was even meant as "if and only if".
I don't think that old wording was ever intended to mean that tm_isdst was to be set only in summer, or even only in daylight-saving periods that were used in summer and adjacent time periods. Such a meaning would have been contradicted by longstanding data, such as Belize's observance of DST from October through February in late 1918 through early 1943. Instead, it was merely sloppy wording where "summer time" was incorrectly used as a synonym for "daylight saving time", the wording that is in the relevant standards and that corresponds to the abbreviation "tm_isdst".
On 2019-12-11 18:39, Paul Eggert wrote:
Instead, it was merely sloppy wording where "summer time" was incorrectly used as a synonym for "daylight saving time", the wording that is in the relevant standards and that corresponds to the abbreviation "tm_isdst".
You are right, "summer time" would be a "sloppy wording" for winter time in Ireland. Michael Deckers.
On 2019-12-11 11:39, Paul Eggert wrote:
On 12/11/19 10:12 AM, Michael H Deckers wrote:
the tzdb description in newctime.3 ... guaranteed (in addition to what C and POSIX require) that "Tm_isdst is non-zero if summer time is in effect" and probably the "if" was even meant as "if and only if".
I don't think that old wording was ever intended to mean that tm_isdst was to be set only in summer, or even only in daylight-saving periods that were used in summer and adjacent time periods. Such a meaning would have been contradicted by longstanding data, such as Belize's observance of DST from October through February in late 1918 through early 1943.
Instead, it was merely sloppy wording where "summer time" was incorrectly used as a synonym for "daylight saving time", the wording that is in the relevant standards and that corresponds to the abbreviation "tm_isdst".
It is more likely that the term was correctly used in some other locale, where summer (and winter) time is often the legal wording in various languages, and where the equivalents of "daylight saving" and "standard", used in North America, are not used or don't exist; sometimes the equivalents of "civil time" and "advanced time" are used instead, but nothing helps when standard is advanced relative to the alternative.

I preferred whichever reference used the terms "standard time" and "alternative time", as that seems to be a more accurate wording for current (existing) usage. Of course, summer and winter time are also useful and accurate, but the relative offset values may not be as expected.

Perhaps tm_isdst should always be set to -1 for inverted zones, where it does not represent summer or saving time, or even be set permanently to -1 for all zones?

-- 
Take care. Thanks, Brian Inglis, Calgary, Alberta, Canada

This email may be disturbing to some readers as it contains too much technical detail. Reader discretion is advised.
Date: Thu, 12 Dec 2019 09:29:03 -0700
From: Brian Inglis <Brian.Inglis@SystematicSw.ab.ca>
Message-ID: <4e191681-6342-a5d3-e539-945a50fd11c3@SystematicSw.ab.ca>

| I preferred whichever reference used the terms "standard time"
| and "alternative time",

That was POSIX, before it was changed (not yet published, but applied; "alternative time" will no longer appear when the next edition is published) based upon some input from this direction. But, while I agree it was better, it's not good enough either - first because there's not just two, and second, because there's no point giving these things a name - whatever time it is (commonly agreed, whether legislatively backed or not) is "standard time" - that's what it means to be "standard".

The only other thing we can (at least currently) rely upon is that there is a current offset from UTC (ie: that UTC is stable, and the time anywhere is, for a short while anyway, a constant offset from UTC - positive or negative (or 0)). Some jurisdictions provide names for the various times, mostly for convenience around the transitions, but none of them are very useful, and here we really cannot rely upon any such thing existing.

tm_isdst was originally just intended as an index into tzname[] to get the (assumed always to exist) "name" of the current time (nb: not timezone). Both of the assumptions there are demonstrably false, first that there is a name to get, and second that there are just two (0 or 1 in tm_isdst). tzname[] is also an interface disaster, and should be retired, and if it is, the sole rational reason for tm_isdst to exist at all will vanish, and it can be deleted as well. [The addition of -1 as a tm_isdst value "don't know" when it is used as input to mktime() came later - as did mktime - that was a hack, still is.]
But please people, we have had this discussion over and over; nothing is going to change anytime in the foreseeable future, neither to correct any "bugs" that appear to exist (they don't - or not in this area) nor to actually make the interface more sane. There's no point continually discussing it.

kre
On Fri, Dec 13, 2019 at 1:17 AM Robert Elz <kre@munnari.oz.au> wrote:
But please people, we have had this discussion over and over, nothing is going to change anytime in the forseeable future, neither to correct any "bugs" that appear to exist (they don't - or not in this area) nor to actually make the interface more sane.
There's no chance to change the existing posix calls. But there's a possibility to add a completely new set of calls which could eventually supersede the current set as the recommended way to handle time.
Date: Fri, 13 Dec 2019 03:54:00 +0100
From: Pierpaolo Bernardi <olopierpa@gmail.com>
Message-ID: <CANY8u7F+z6-oH1N_GpEDye-yJt33Q1-6HofV1bXUk=LTydmp4A@mail.gmail.com>

| There's no chance to change the existing posix calls. But there's a
| possibility to add a completely new set of calls which could

Yes, agreed. That would be the way forward.

kre
On Tue, Dec 10, 2019 at 23:43, Guy Harris wrote:

| I.e., it *is* DST, except that the "S" stands for "spending" rather than "saving". :-)

:-) It's mostly a matter of documentation and expectation. I for one have been surprised by the recent change of interpretation, to the extent that I thought this was a bug - despite having read the documentation to a reasonable depth in 2017. I've changed my programme to no longer use isdst (or rely on what I thought isdst meant).

Can I suggest making explicit in zdump's manpage (and other places) that isdst==1 doesn't necessarily mean clocks going forward? I believe that daylight saving time is widely understood to be synonymous with summer time: https://en.wikipedia.org/wiki/Daylight_saving_time Again, a simple case of noting in the documentation that this is not the intended interpretation here.

Cheers

Stefan

I feel there were missed opportunities, too, when developing standards: Rather than foregrounding time difference to GMT and isdst (1 for DXT "daylight changed time", 0 for normal and -1 for unknown) one might have foregrounded the difference to what the region considers its normal time zone and the current offset owing to DXT, if any. Using the current interpretation the DXT offset would be -1 for Ireland and 1 for the UK (while Antarctica/Troll would be 2 and Lord Howe Island 0.5). That way the difference to GMT would be the sum of slow changing (geographic timezone) and potentially annually changing (DXT) offsets. In my mind this would have been more informative and easier to understand than isdst.
Stefan Rueger said:
I feel there were missed opportunities, too, when developing standards: Rather than foregrounding time difference to GMT and isdst (1 for DXT "daylight changed time", 0 for normal and -1 for unknown) one might have foregrounded the difference to what the region considers its normal time zone and the current offset owing to DXT, if any. Using the current interpretation the DXT offset would be -1 for Ireland and 1 for the UK (while Antarctica/Troll would be 2 and Lord Howe Island 0.5). That way the difference to GMT would be the sum of slow changing (geographic timezone) and potentially annually changing (DXT) offsets. In my mind this would have been more informative and easier to understand than isdst.
I don't disagree with you. The problem is that the original standards were mostly written by US people who assume that the whole world works their way. Where there's a clean sheet to start from, we can do better things [1], but that's a lot harder with a large installed base.

The US-centric problem isn't new to this. I've lost count how many web sites can't cope with the fact that I don't have a zip code or that I have two middle initials, not one. I'm lucky that I don't have any non-ASCII characters in my name; there's a whole nother minefield waiting there.

Allow me to point you at:
https://www.kalzumeus.com/2010/06/17/falsehoods-programmers-believe-about-na...
https://www.mjt.me.uk/posts/falsehoods-programmers-believe-about-addresses/
https://infiniteundo.com/post/25326999628/falsehoods-programmers-believe-abo...
https://wiesmann.codiferes.net/wordpress/?p=15187&lang=en

[1] For example, I've ensured that the Bluetooth Mesh standard uses TAI internally for all timestamps, leaving the leap second mess to the periphery.

-- 
Clive D.W. Feather          | If you lie to the compiler,
Email: clive@davros.org     | it will get its revenge.
Web: http://www.davros.org  | - Henry Spencer
Mobile: +44 7973 377646
On 2019-12-13 15:13, Clive D.W. Feather wrote:
I don't disagree with you. The problem is that the original standards were mostly written by US people who assume that the whole world works their way. Where there's a clean sheet to start from, we can do better things [1], but that's a lot harder with a large installed base.
Is anyone aware of any plans to backport C++ work on chrono, date, tz, calendar, etc. APIs to POSIX and/or C?
The US-centric problem isn't new to this. I've lost count how many web sites can't cope with the fact that I don't have a zip code or that I have two middle initials, not one. I'm lucky that I don't have any non-ASCII characters in my name; there's a whole nother minefield waiting there.
Allow me to point you at:
https://www.kalzumeus.com/2010/06/17/falsehoods-programmers-believe-about-na...
https://www.mjt.me.uk/posts/falsehoods-programmers-believe-about-addresses/
https://infiniteundo.com/post/25326999628/falsehoods-programmers-believe-abo...
https://wiesmann.codiferes.net/wordpress/?p=15187&lang=en
Comprehensive list of lists: https://github.com/kdeldycke/awesome-falsehood
[1] For example, I've ensured that the Bluetooth Mesh standard uses TAI internally for all timestamps, leaving the leap second mess to the periphery.
How does TAI get set, given that mostly UTC is distributed and LS need to be added to get TAI? Or do you mean an arbitrary TAI timescale ignoring LS?

-- 
Take care. Thanks, Brian Inglis, Calgary, Alberta, Canada

This email may be disturbing to some readers as it contains too much technical detail. Reader discretion is advised.
Brian Inglis said:
[1] For example, I've ensured that the Bluetooth Mesh standard uses TAI internally for all timestamps, leaving the leap second mess to the periphery.
How does TAI get set given that mostly UTC is distributed and LS need to be added to get TAI? Or do you mean an arbitrary TAI timescale ignoring LS?
No, that's what I meant by "the periphery". The management device - perhaps your phone, or in a big site one of the servers - has to know all about time zones and leap seconds. But if you want 87 lights to turn on at exactly 06:23:45 tomorrow, you don't need those 87 light bulbs to all have to manage time zone and leap second tables [1]. The main network just sends out times as TAI/UTC seconds since an epoch.

[1] This isn't a joke. A major use case for Mesh is controlling lights in a large building without lots of cables to light switches: the switches can be little battery-powered things that sit on a desk or are stuck to the wall.

-- 
Clive D.W. Feather          | If you lie to the compiler,
Email: clive@davros.org     | it will get its revenge.
Web: http://www.davros.org  | - Henry Spencer
Mobile: +44 7973 377646
On Dec 13, 2019, at 2:13 PM, Clive D.W. Feather <clive@davros.org> wrote:
[1] For example, I've ensured that the Bluetooth Mesh standard uses TAI internally for all timestamps, leaving the leap second mess to the periphery.
Yes, section 5.1.1 "Time" of

	https://www.bluetooth.org/docman/handlers/downloaddoc.ashx?doc_id=429634

specifies that the base representation of times is the number of seconds that have elapsed since 1999-12-31T23:59:28 UTC. :-) :-) :-) :-) :-)

(I.e., the only way in which it "uses TAI" rather than using UTC is that the epoch is defined as an instant of time that has more zeroes in the way it's represented as TAI than in the way it's represented as UTC. If it had been defined as the number of seconds that have elapsed since 2000-01-01T00:00:00 UTC, that would still "leave the leap second mess to the periphery"; it would just mean that the count of seconds would be 32 seconds lower.

By the way, how is TAI, represented as YYYY-MM-DDTHH:MM:SS TAI, defined? Does it have a Gregorian-style calendar made up entirely of 86400-second days, so that most years are exactly 31,536,000 seconds long, with some years being exactly 31,622,400 seconds long?

If so, then there are at least four different flavors of time stamp, with "time stamp" defined as "label assigned to an instant of time":

1) "TAI-style calendar+clock", with a time stamp being a Gregorian-style YYYY-MM-DDTHH:MM:SS, with non-leap years being exactly 31,536,000 seconds long and leap years being exactly 31,622,400 seconds long, so that SS goes from 00 to 59, never to 60;

2) "UTC-style calendar+clock", with a time stamp being a Gregorian-style YYYY-MM-DDTHH:MM:SS, but with some years being 31,536,001 seconds, 31,622,401 seconds long, or 31,622,402 seconds long, depending on how many leap seconds there were during the year, so SS could occasionally go from 59 to 60;

3) counts of seconds since a specified epoch, with the count including leap seconds;

4) counts of non-leap seconds since a specified epoch.

Note that 3) could use either a TAI-style calendar+clock or a UTC-style calendar+clock to specify the epoch; the same is true of 4).)
Guy Harris said:
[1] For example, I've ensured that the Bluetooth Mesh standard uses TAI internally for all timestamps, leaving the leap second mess to the periphery.
Yes, section 5.1.1 "Time" of
https://www.bluetooth.org/docman/handlers/downloaddoc.ashx?doc_id=429634
specifies that the base representation of times is the number of seconds that have elapsed since 1999-12-31T23:59:28 UTC. :-) :-) :-) :-) :-)
Yes. But ...
(I.e., the only way in which it "uses TAI" rather than using UTC is that the epoch is defined as an instant of time that has more zeroes in the way it's represented as TAI than in the way it's represented as UTC. If it had been defined as the number of seconds that have elapsed since 2000-01-01T00:00:00 UTC. that would still "leave the leap second mess to the periphery"; it would just mean that the count of seconds would be 32 seconds lower.
No, that's not the only way.
By the way, how is TAI, represented as YYYY-MM-DDTHH:MM:SS TAI, defined? Does it have a Gregorian-style calendar made up entirely of 86400-second days, so that most years are exactly 31,536,000 seconds long, with some years being exactly 31,622,400 seconds long?
Yes, and that's a big part of "avoiding the mess" - you don't worry about time zones *or* leap seconds.
If so, then there are at least four different flavors of time stamp, with "time stamp" defined as "label assigned to an instant of time":
1) "TAI-style calendar+clock", with a time stamp being a Gregorian-style YYYY-MM-DDTHH:MM:SS, with non-leap years being exactly 31,536,000 seconds long and leap years being exactly 31,622,400 seconds long, so that SS goes from 00 to 59, never to 60;
That's what Mesh uses when handling a "broken-down" time. Hence "TAI".
3) counts of seconds since a specified epoch, with the count including leap seconds;
And that's what Mesh uses when just counting seconds. I agree that it doesn't matter whether you count TAI seconds or UTC seconds, because they're exactly the same.
Note that 3) could use either a TAI-style calendar+clock or a UTC-style calendar+clock to specify the epoch; the same is true of 4).)
True. But that excludes your item 1.

-- 
Clive D.W. Feather          | If you lie to the compiler,
Email: clive@davros.org     | it will get its revenge.
Web: http://www.davros.org  | - Henry Spencer
Mobile: +44 7973 377646
On Dec 14, 2019, at 1:23 AM, Clive D.W. Feather <clive@davros.org> wrote:
Guy Harris said:
[1] For example, I've ensured that the Bluetooth Mesh standard uses TAI internally for all timestamps, leaving the leap second mess to the periphery.
Yes, section 5.1.1 "Time" of
https://www.bluetooth.org/docman/handlers/downloaddoc.ashx?doc_id=429634
specifies that the base representation of times is the number of seconds that have elapsed since 1999-12-31T23:59:28 UTC. :-) :-) :-) :-) :-)
Yes. But ...
(I.e., the only way in which it "uses TAI" rather than using UTC is that the epoch is defined as an instant of time that has more zeroes in the way it's represented as TAI than in the way it's represented as UTC. If it had been defined as the number of seconds that have elapsed since 2000-01-01T00:00:00 UTC. that would still "leave the leap second mess to the periphery"; it would just mean that the count of seconds would be 32 seconds lower.
No, that's not the only way.
What's the other way?
By the way, how is TAI, represented as YYYY-MM-DDTHH:MM:SS TAI, defined? Does it have a Gregorian-style calendar made up entirely of 86400-second days, so that most years are exactly 31,536,000 seconds long, with some years being exactly 31,622,400 seconds long?
Yes, and that's a big part of "avoiding the mess" - you don't worry about time zones *or* leap seconds.
...unless you need to convert a Bluetooth Mesh time stamp to UTC. And the same would be true if it counted the number of seconds that have elapsed since 2000-01-01T00:00:00 UTC.
If so, then there are at least four different flavors of time stamp, with "time stamp" defined as "label assigned to an instant of time":
1) "TAI-style calendar+clock", with a time stamp being a Gregorian-style YYYY-MM-DDTHH:MM:SS, with non-leap years being exactly 31,536,000 seconds long and leap years being exactly 31,622,400 seconds long, so that SS goes from 00 to 59, never to 60;
That's what Mesh uses when handling a "broken-down" time. Hence "TAI".
Preview doesn't report the word "broken" anywhere in the spec.

The times discussed in 5.1.4.2 Schedule Register presumably are, when no wildcards are involved, local dates/times, with a time stamp being a Gregorian-style YY-MM-DDTHH:MM:SS, and with non-leap years being either 31,536,000, 31,536,001, or 31,536,002 seconds long, and with leap years being either 31,622,400, 31,622,401, or 31,622,402 seconds long (which might disagree with most real-world clocks, as those clocks probably behave as if you had a UTC calendar and a clock that ticks like TAI, in the sense that it always goes from 23:59:59 to 00:00:00, but that's some number of seconds off from TAI).

The Time state includes TAI-UTC Delta Current, TAI-UTC Delta New, and TAI of Delta Change; those require dealing with leap seconds.
3) counts of seconds since a specified epoch, with the count including leap seconds;
And that's what Mesh uses when just counting seconds. I agree that it doesn't matter whether you could TAI seconds or UTC seconds, because they're exactly the same.
I.e., nothing prevents such a clock from
Note that 3) could use either a TAI-style calendar+clock or a UTC-style calendar+clock to specify the epoch; the same is true of 4).)
True. But that excludes your item 1.
3) and 4) both completely exclude 1) and 2) - they're all different schemes for time-stamping instants of time, so 1) excludes 2), 3), and 4), 2) excludes 1), 3), and 4), 3) excludes 1), 2), and 4), and 4) excludes 1), 2) and 3).

For 1) and 2), a time stamp would look like 2019-12-15T09:54:56; for 3) and 4), a time stamp would look like 1576317243. Both 3) and 4) could use either a 1)-style stamp or a 2)-style stamp in whatever spec defines the epoch.

(The point is that, in a time stamping mechanism that stamps times with counts of seconds since an epoch, whether the count includes leap seconds is independent of whether the specification for the epoch includes the letters "T", "A", and "I" in sequence with no spaces between them or includes the letters "U", "T", and "C" in sequence with no spaces between them.)
Guy Harris wrote:
On Dec 14, 2019, at 1:23 AM, Clive D.W. Feather <clive@davros.org> wrote:
Guy Harris said:
...the only way in which it "uses TAI" rather than using UTC is that the epoch is defined as...
No, that's not the only way.
What's the other way?
I'm guessing that the other way boils down to (however you try to represent dates and times) *not* having any leap seconds anywhere -- that is, unambiguously working in TAI everywhere, and drifting farther away from UTC every time there's a leap second.
Yes, and that's a big part of "avoiding the mess" - you don't worry about time zones *or* leap seconds.
...unless you need to convert a Bluetooth Mesh time stamp to UTC.
Right. I imagine that's a pretty big problem.
The times discussed in 5.1.4.2 Schedule Register presumably are [...] with non-leap years being either 31,536,000, 31,536,001, or 31,536,002 seconds long
I wouldn't presume that at all. If it's doing TAI, if it's not counting leap seconds, why would you presume years of other than 31,536,000 or 31,622,400 seconds?
The point is that, in a time stamping mechanism that stamps times with counts of seconds since an epoch, whether the count includes leap seconds is independent of whether the specification for the epoch includes the letters "T", "A", and "I" ...
There may be additional subtleties here that I'm overlooking, but to me, this discussion cuts to the heart of the whole eternal leap second imbroglio.

Guy has, I think, tried to carefully decouple two aspects of the question:

A. Does your time scale count leap seconds or not, and
B. Does your representation use broken-down times ("calendar + clock"), or monotonic counts of seconds since an epoch?

I agree that those issues are orthogonal. (I'm also going to ignore for now the side question of whether the epoch mentioned in "B" is defined as a UTC or TAI timestamp.)

So when Guy said "there are at least four different flavors of time stamp", I think his flavors map onto my two aspects A and B with this little truth table:

 A    B
 no   b-d   1) TAI-style calendar+clock
 yes  b-d   2) UTC-style calendar+clock
 no   mono  4) counts of non-leap seconds since a specified epoch
 yes  mono  3) counts of seconds since a specified epoch, with the count including leap seconds

Me, I would say that (1) and (4) are completely compatible -- they're two different representations for the same underlying time scale. You can perfectly interconvert between them using the same algorithm Posix defines for time_t.

When it comes to UTC, however, (2) and (3) are most definitely not compatible. Personally, I believe that (2) is the *only* way to represent UTC. I believe -- Posix time_t most assuredly notwithstanding -- that (3) is an abomination. If you try to represent UTC using a monotonic count of seconds, you have to either (a) ignore leap seconds after all (as Posix does, but which totally contradicts those words "with the count including leap seconds" in the definition of (3)), or (b) use a kludgey and non-obvious (and non-Posix) mapping between your monotonic time and broken-down calendar+time, a mapping which always requires you to have a perfect leap second table available.
And in fact, if you try to do (b), I believe that you are *not* really doing UTC, after all -- you are basically doing TAI (perhaps with a slightly different epoch), and misleadingly calling it UTC. You are engendering just as much confusion as the Posix time_t definition does. A monotonic count of seconds is simply not an appropriate representation for UTC. The only proper representation(s) of UTC are those that separate the date from the time, so that they can cleanly, honestly, openly, and explicitly accommodate those pesky days of other than 86400 seconds.
On Dec 14, 2019, at 8:00 AM, Steve Summit <scs@eskimo.com> wrote:
Guy Harris wrote:
On Dec 14, 2019, at 1:23 AM, Clive D.W. Feather <clive@davros.org> wrote:
Guy Harris said:
...the only way in which it "uses TAI" rather than using UTC is that the epoch is defined as...
No, that's not the only way.
What's the other way?
I'm guessing that the other way boils down to (however you try to represent dates and times) *not* having any leap seconds anywhere -- that is, unambiguously working in TAI everywhere, and drifting farther away from UTC every time there's a leap second.
Meaning, for example, that the time and date fields in 5.1.4.2 "Schedule Register" represent TAI times rather than local times?

The time stamps described in 5.1.1 "Time" are "type 3" in my list - a count of all elapsed seconds, including leap seconds, since a given epoch.

If you're keeping UTC - not the not-quite-UTC called "POSIX time" - you *also* need a count of all elapsed seconds, including leap seconds, since a given epoch, given that UTC time, like TAI time, changes even during positive leap seconds; it just happens to change in a different way. In TAI, it always goes, presumably, from {day} 23:59:59 TAI to {next day} 00:00:00 TAI, regardless of whether a positive leap second is inserted at that instant of time or not. In UTC, it goes from {day} 23:59:59 UTC to {same day} 23:59:60 UTC, and in the *next* second goes to {next day} 00:00:00 UTC.

It's only in times like "POSIX time" where it stays stuck at 23:59:59, or is slowed down, or whatever form of tweaking is done to pretend that every day is 86400 seconds long; that's not done in real UTC.
Yes, and that's a big part of "avoiding the mess" - you don't worry about time zones *or* leap seconds.
...unless you need to convert a Bluetooth Mesh time stamp to UTC.
Right. I imagine that's a pretty big problem.
The times discussed in 5.1.4.2 Schedule Register presumably are [...] with non-leap years being either 31,536,000, 31,536,001, or 31,536,002 seconds long
I wouldn't presume that at all. If it's doing TAI, if it's not counting leap seconds, why would you presume years of other than 31,536,000 or 31,622,400 seconds?
Because, in that section, it's *not* doing TAI:

    5.1.4.2 Schedule Register

    The Schedule Register state is a 16-entry, zero-based, indexed array of 76-bit values formatted as Scheduled Time. Each entry represents a state-changing event. *Time and date fields represent local time.*
The point is that, in a time stamping mechanism that stamps times with counts of seconds since an epoch, whether the count includes leap seconds is independent of whether the specification for the epoch includes the letters "T", "A", and "I" ...
There may be additional subtleties here that I'm overlooking, but to me, this discussion cuts to the heart of the whole eternal leap second imbroglio.
Guy has, I think, tried to carefully decouple two aspects of the question:
A. Does your time scale count leap seconds or not, and B. Does your representation use broken-down times ("calendar + clock"), or monotonic counts of seconds since an epoch?
What does "count leap seconds" mean in this context? If you have a B-style representation, it means "does that count increase during a positive leap second?" (we'll ignore negative leap seconds here). If you have an A-style representation, and it's TAI or UTC, it "counts leap seconds" in the sense that "the clock value changes even for the leap second and, if it overflows, the calendar value changes as well".

The difference between TAI and UTC in that regard is in the fashion in which the clock value changes. The clock value changes in the same fashion, all the time, in TAI; it doesn't change in the same fashion, all the time, in UTC, *and the way in which it will change in the future is not predictable past a certain point* - you have to wait for an IERS Bulletin C to come out for that point before you can predict.

So are A-style representations defined for both UTC and TAI? ITU-R Recommendation TF.460-6 says:

    2 Leap-seconds

    2.1 A positive or negative leap-second should be the last second of a UTC month, but first preference should be given to the end of December and June, and second preference to the end of March and September.

    2.2 A positive leap-second begins at 23h 59m 60s and ends at 0h 0m 0s of the first day of the following month. In the case of a negative leap-second, 23h 59m 58s will be followed one second later by 0h 0m 0s of the first day of the following month (see Annex 3).

but 1) where is the form HHh MMm SSs for time stamps specified for UTC, 2) is that also specified for TAI?, and 3) is it stated anywhere how days are indicated in this context (YYYY-MM-DD? DD MMMMM YYYY?)

What that Recommendation says is

    B International atomic time (TAI)

    The international reference scale of atomic time (TAI), based on the second (SI), as realized on the rotating geoid, is formed by the BIPM on the basis of clock data supplied by cooperating establishments. It is in the form of a continuous scale, e.g. in days, hours, minutes and seconds from the origin 1 January 1958 (adopted by the CGPM 1971).

    C Coordinated universal time (UTC)

    UTC is the time-scale maintained by the BIPM, with assistance from the IERS, which forms the basis of a coordinated dissemination of standard frequencies and time signals. It corresponds exactly in rate with TAI but differs from it by an integer number of seconds.

    The UTC scale is adjusted by the insertion or deletion of seconds (positive or negative leap-seconds) to ensure approximate agreement with UT1.

which, at most, says that, *for example*, TAI is in the form of days (presumably days that are always exactly 86400 seconds long), hours, minutes, and seconds from "the origin 1 January 1958" (does that mean "midnight, 1 January 1958"?).

If, in fact, "days" correspond to exactly-86400-second-long days, and "the origin 1 January 1958" is "midnight, 1 January 1958", TAI could also be represented as a scale of seconds since the origin 1 January 1958, with conversion between those two scales being straightforward (division and remainder to go from the second scale to the first, multiplication and addition to go from the first scale to the second, given that there are no leap seconds, month lengths, or leap years to worry about).

Can TAI also be represented as an absolute Gregorian-style date (year, month of year, day of month) and a time of hours, minutes, and seconds since the beginning of that day? If so, are there any specified rules for that? (Conversion to or from that scale from either of the scales mentioned in the previous paragraph is a little more work, as you'd have to worry about month lengths and leap years.)

And in what ways could the UTC scale be represented?
The "differs from it by an integer number of seconds", with the difference varying over time, would presumably be a difference between the "absolute Gregorian-style date (year, month of year, day of month) and a time of hours, minutes, and seconds since the beginning of that day" representations of TAI or UTC, given that "It corresponds exactly in rate with TAI" means that, for the "days, hours, minutes and seconds from the origin 1 January 1958" definition of TAI, every one of those seconds occurs in UTC as well; so if you represent UTC as a count of days, hours, minutes, and seconds from some origin, or just as a count of seconds from that origin, the difference between the two scales, in seconds, is the difference between the origins, in seconds.

Annex 3 of that Recommendation, "Dating of events in the vicinity of a leap-second", shows events being dated "30 June, 23h 59m 60.6s UTC"; as it's an example, the year is omitted, so presumably the "30 June" would have a year number after it for a real event, i.e. an absolute Gregorian-style date.
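The "division and remainder" conversion described above is short enough to sketch (hypothetical helper names; this assumes the exactly-86400-second-day reading of the Recommendation):

```python
def tai_seconds_to_dhms(t):
    # Split a count of SI seconds since the 1958 origin into
    # (days, hours, minutes, seconds) -- division and remainder only,
    # since every TAI day is taken to be exactly 86400 s long.
    days, rem = divmod(t, 86400)
    hours, rem = divmod(rem, 3600)
    minutes, seconds = divmod(rem, 60)
    return days, hours, minutes, seconds

def dhms_to_tai_seconds(days, hours, minutes, seconds):
    # Inverse direction: multiplication and addition only.
    return ((days * 24 + hours) * 60 + minutes) * 60 + seconds
```

No leap seconds, month lengths, or leap years enter into either direction, which is the point being made.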
Me, I would say that (1) and (4) are completely compatible -- they're two different representations for the same underlying time scale. You can perfectly interconvert between them using the same algorithm Posix defines for time_t.
When it comes to UTC, however, (2) and (3) are most definitely not compatible. Personally, I believe that (2) is the *only* way to represent UTC. I believe -- Posix time_t most assuredly notwithstanding -- that (3) is an abomination.
(2) and (3) are both problematic. With (2), time stamps for events defined to occur at a specified sufficiently large number of seconds in the future cannot be predicted. That includes all events with times specified by a (1)-style or (4)-style time stamp if those times go past what the IERS has specified. With (3), some time stamps cannot be specified - including time stamps for events in the *past*, such as an event occurring on 30 June 2015 at 23h 59m 60s UTC.
If you try to represent UTC using a monotonic count of seconds, you have to either (a) ignore leap seconds after all (as Posix does, but which totally contradicts those words "with the count including leap seconds" in the definition of (3)), or (b) use a kludgey and non-obvious (and non-Posix) mapping between your monotonic time and broken-down calendar+time, a mapping which always requires you to have a perfect leap second table available.
And in fact, if you try to do (b), I believe that you are *not* really doing UTC, after all -- you are basically doing TAI (perhaps with a slightly different epoch), and misleadingly calling it UTC.
If you are doing (a), you're not doing UTC either, given that POSIX can't distinguish between, say, 30 June 2015, 23h 59m 59s UTC and 30 June 2015, 23h 59m 60s UTC. (a) can't even do UTC correctly for *past* events. (b) *can* do UTC correctly for past events if you have a leap second table available that includes all announced leap seconds as of now - I think there may even be some sample code to do that. :-) What it can't do is handle UTC for *future* events, as a "perfect" leap second table requires perfect knowledge of the future, and we all know what somebody (Karl Kristian Steincke?) said about predicting: http://quoteinvestigator.com/2013/10/20/no-predict/ TLDR: "you can't do UTC for arbitrary times in the future without running the risk of having to adjust something before the event". (Well, with sufficient thiotimoline used in your clock you might. :-))
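The "(b)" mapping, with its required leap second table, can be sketched like this (the epoch and table are toy values chosen so that the 2015-06-30 leap second is reachable; Python's datetime knows nothing of leap seconds, so the 23:59:60 label has to be patched in by hand - exactly the kludge being described):

```python
import datetime

EPOCH = datetime.datetime(2015, 6, 30)   # hypothetical epoch: 2015-06-30 00:00:00 UTC
LEAP_COUNTS = {86400}                    # counts at which a 23:59:60 second occurred

def count_to_utc_label(count):
    """Map 'seconds elapsed since EPOCH, leap seconds included' to a UTC
    label, using the leap second table LEAP_COUNTS."""
    leaps_before = sum(1 for c in LEAP_COUNTS if c < count)
    if count in LEAP_COUNTS:
        # This count *is* a leap second: take the previous second's label
        # and force its seconds field to 60, which datetime cannot hold.
        t = EPOCH + datetime.timedelta(seconds=count - 1 - leaps_before)
        return t.strftime("%Y-%m-%d %H:%M:") + "60"
    t = EPOCH + datetime.timedelta(seconds=count - leaps_before)
    return t.strftime("%Y-%m-%d %H:%M:%S")
```

This works for past events exactly because LEAP_COUNTS can be complete for the past; for future counts past the announced table, the returned label is only a guess.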
You are engendering just as much confusion as the Posix time_t definition does. A monotonic count of seconds is simply not an appropriate representation for UTC. The only proper representation(s) of UTC are those that separate the date from the time, so that they can cleanly, honestly, openly, and explicitly accommodate those pesky days of other than 86400 seconds.
You cannot predict what the broken-down UTC label for an event N seconds in the future will be, if N is sufficiently large that it will be past a point at which a leap second might be introduced and for which the IERS hasn't announced whether there will be a leap second or not - the UTC label depends on whether a leap second will be inserted there or not. So UTC - and local civil time, if based on UTC - isn't a good way to label events sufficiently far in the future (as in, right now, any event happening at or after the end of 30 June 2020), if you can't live with being a few seconds off. So if we want to describe *any* timekeeping scheme that represents times in a format that is, or can trivially be converted to and from without loss of information, a count of seconds, including leap seconds, since some epoch, as "TAI", OK. That does *not*, of course, mean that the epoch needs to be expressed as a TAI time stamp; "seconds, including leap seconds, since 1970-01-01 00:00:00 UTC" would then be a representation of TAI (given that we know what instant of time was labeled 1970-01-01 00:00:00 UTC - no knowledge of the future is needed there).
The problem we have in the current discussion is that, once again, we're trying to oversimplify time - make time work the way we think it should work, rather than the way it does.

It makes no sense (to me) at all to convert TAI into any kind of (currently used) calendar type measurements - the calendars we use are designed to match astronomical reality (the various rotations of the earth) and those things just don't take constant periods, even though they're close. But that's what we have decided we want from our calendar system - we want it to always be summer in January, and winter in July (those unfortunate enough to be in the wrong hemisphere can adapt as needed), and TAI (alone, without some kinds of corrections) simply does not provide that. If you want some kind of calendar for TAI you'd be better to convert it to stardates or something.

Certainly I don't think it makes any rational sense to have different length "years" (regular years and leap years, with a bizarre rule about which years are which) in TAI - though it is certainly sensible to have a unit bigger than seconds in which to measure lengthy periods (though I doubt anything approximating a year is really big enough - TAI is useful for measuring short durations (something we would conventionally count in tiny fractions of a second up to hours), and very long ones (millions of years, and longer), but not so much for anything in between).

What we really need to do is recognise that there is not (rather, should not be) just one definition (one unit) "a second" - there are two entirely different things we use commonly, which just happen to be so close together that we have tried to force them to be the same thing.
One is the thing TAI uses (and UTC with leap second corrections) - which is a period defined by something that we believe to be fixed - always the exact same duration, and so can be used for comparing durations from measurements collected separately.

The other is 1/86400 of a day (a bizarre choice, but never mind) where a day is defined according to the (varying) rotations of the earth, and hence the second is not a fixed length, but a variable one. That's the POSIX second (time_t) value, but is also how humans have measured time ever since we started doing it - initially it wasn't understood that the length of a day wasn't constant (as it so nearly is), but 60 seconds a minute, 60 minutes an hour, 24 hours a day (ie: 86400 seconds a day) is what has been used for a long time now, with the more precise measurements appearing as we gained the ability to make them: sundials to be able to measure hours, hourglasses to measure minutes, and sometimes fractions thereof, ...

We really should be using two different names for these two different things, rather than trying to pretend that a "second" can be defined which suits both purposes. But until a large enough fraction of the population needs to routinely deal with the differences, I don't see that happening. Certainly it doesn't help to continually try and force one another to adopt only one way of measuring time, as there is (as long as we keep anything like our current calendars) no way to make that work simply - nothing anyone would be prepared to attempt to implement - that is, even if one can accept that any current calendar calculations are even close to simple.

kre
Robert Elz wrote:
It makes no sense (to me) at all to convert TAI into any kind of (currently used) calendar type measurements -
Fair enough.
Certainly I don't think it makes any rational sense to have different length "years" (regular years and leap years, with a bizarre rule about which years are which) in TAI - though it is certainly sensible to have a unit bigger than seconds in which to measure lengthy periods...
Vernor Vinge has a wonderful set of SF novels ("A Deepness in the Sky", etc.), set in the far future, in which they measure time in kiloseconds, megaseconds, etc. There's also this marvelous passage concerning legacy software:

    Via a million million circuitous threads of inheritance, many of the oldest programs still ran in the bowels of the Qeng Ho system. Take the Traders' method of timekeeping. The frame corrections were incredibly complex -- and down at the very bottom of it was a little program that ran a counter. Second by second, the Qeng Ho counted time from the instant that a human had first set foot on Old Earth's moon. But if you looked at it still more closely... the starting instant was actually about fifteen million seconds later, the 0-second of one of Humankind's first computer operating systems.
What we really need to do is recognise that there is not [...] just one definition (one unit) "a second" - there are two entirely different things we use commonly [...] One is the thing TAI uses [...] The other is 1/86400 of a day [...] where a day is defined according to the (varying) rotations of the earth, and hence the second is not a fixed length, but a variable one. That's the POSIX second (time_t) value [...]
I don't entirely disagree, and indeed bringing UT1 to bear is an attractive way of trying to rescue the benighted Posix definition, but time_t is certainly not defined that way now, and I'm not aware of anyone implementing it that way! time_t seconds really are UTC/TAI seconds (except, of course, when they're not). [But perhaps what you're saying is that redefining time_t as UT1 is part of "What we really need to do".]
Date: Sat, 14 Dec 2019 19:56:42 -0500
From: scs@eskimo.com (Steve Summit)
Message-ID: <2019Dec14.1956.scs.0008@quinine2.home>

  | [But perhaps what you're saying is that redefining time_t as UT1
  | is part of "What we really need to do".]

First, part of my belief is that nothing that POSIX (or the C std) has related to time is even close to adequate.

For time_t what I mean is that it should not survive as it is, we need (just for "seconds" time counters) two entirely different things, which keep track of the different ways we count time, differently. Currently this is fudged with how one obtains the time_t (CLOCK_MONOTONIC, CLOCK_REALTIME etc) but because they all produce the same type (time_t) they can all be processed the same way - so a CLOCK_MONOTONIC result can be handed to localtime() which makes no rational sense at all.

kre
On 15 Dec 2019, at 05:16, Robert Elz <kre@munnari.OZ.AU> wrote:
Date: Sat, 14 Dec 2019 19:56:42 -0500 From: scs@eskimo.com (Steve Summit) Message-ID: <2019Dec14.1956.scs.0008@quinine2.home>
| [But perhaps what you're saying is that redefining time_t as UT1 | is part of "What we really need to do".]
First, part of my belief is that nothing that POSIX (or the C std) has related to time is even close to adequate.
For time_t what I mean is that it should not survive as it is, we need (just for "seconds" time counters) two entirely different things, which keep track of the different ways we count time, differently. Currently this is fudged with how one obtains the time_t (CLOCK_MONOTONIC, CLOCK_REALTIME etc) but because they all produce the same type (time_t) they can all be processed the same way - so a CLOCK_MONOTONIC result can be handed to localtime() which makes no rational sense at all.
kre
I completely agree.

IIRC, Posix defines the time returned by various things as "since approximately 1970-01-01 00:00:00 UTC" or "since an undefined epoch". Basically, the first time_t roughly gives the current wall-clock time in the Greenwich observatory in northern hemisphere winter (you can offset it to whatever timezone you think you are in). The second time_t gives you a reliable way of measuring elapsed time[1].

The central assumption of posix time-of-day is that a day is exactly 86400 seconds, which is handy if you're doing quick and dirty coding: but how long does sleep(86400) sleep for over a leap second? (There are good, bad and ugly possible answers. And accurate ones.)

I think Go has the right idea. A time.Time object works for both elapsed and wall-clock time by holding two possible values. So when there's been a leap second, noon-to-noon should show an elapsed time of 86401 seconds (I wonder if it does?) and if you want to sleep() from noon to noon you need to sleep the calculated interval, not 86400.

It's a mess.

jch

[1] a certain database is, finally, using that to measure elapsed time.
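The two-readings idea jch attributes to Go can be sketched in Python's terms (an illustration only; `Instant` is a made-up name, and Python's stdlib, unlike Go, will happily let you misuse either clock):

```python
import time

class Instant:
    # Capture both clocks at once, Go-style: the wall reading is for
    # labelling, the monotonic reading is for measuring.
    def __init__(self):
        self.wall = time.time()        # CLOCK_REALTIME-ish: seconds since the epoch
        self.mono = time.monotonic()   # CLOCK_MONOTONIC: origin undefined

    def elapsed_since(self, earlier):
        # Elapsed time uses only the monotonic readings, so it is
        # immune to the wall clock being stepped underneath us.
        return self.mono - earlier.mono

# Nothing stops the misuse kre complains about: this runs without error,
# but the result is meaningless, since mono's origin is undefined.
nonsense = time.localtime(Instant().mono)
```

Carrying both readings in one object is exactly the discipline that a bare time_t cannot enforce.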
On Dec 14, 2019, at 9:16 PM, Robert Elz <kre@munnari.OZ.AU> wrote:
For time_t what I mean is that it should not survive as it is, we need (just for "seconds" time counters) two entirely different things, which keep track of the different ways we count time, differently. Currently this is fudged with how one obtains the time_t (CLOCK_MONOTONIC, CLOCK_REALTIME etc) but because they all produce the same type (time_t) they can all be processed the same way - so a CLOCK_MONOTONIC result can be handed to localtime() which makes no rational sense at all.
For the purposes of logging events as they happen (meaning that, assuming the clock isn't incorrectly set ahead, all time stamps quickly become time stamps in the past), a value that 1) is a count of seconds (and perhaps fractional seconds) that have elapsed since some defined epoch, including leap seconds, and 2) can be converted to YYYY-MM-DD HH:MM:SS.SSSS is useful. That way, 1) you can easily calculate time deltas between those events and 2) you can find out when a particular event happened. (Some may already have guessed that I'm thinking of, among other things, network traffic captures here.)
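A toy comparison of the two kinds of time stamp for that logging use (the leap-inclusive counts below use a hypothetical 2015-06-30 00:00:00 UTC epoch; the POSIX values are the real time_t numbers around the 2015-06-30 leap second):

```python
LEAP_INCLUSIVE = {       # seconds since the hypothetical epoch, leap seconds counted
    "A": 86399,          # event A: 2015-06-30 23:59:59 UTC
    "B": 86401,          # event B: 2015-07-01 00:00:00 UTC (a leap second lies between)
}
POSIX = {
    "A": 1435708799,     # the same event A as a POSIX time_t
    "B": 1435708800,     # the same event B
}

true_delta  = LEAP_INCLUSIVE["B"] - LEAP_INCLUSIVE["A"]   # 2 s really elapsed
posix_delta = POSIX["B"] - POSIX["A"]                     # 1 s: the leap second vanished
```

With the leap-inclusive count, the delta is plain subtraction and correct; the POSIX count silently loses the inserted second.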
On 2019-12-14 17:56, Steve Summit wrote:
Vernor Vinge has a wonderful set of SF novels ("A Deepness in the Sky", etc.), set in the far future, in which they measure time in kiloseconds, megaseconds, etc. There's also this marvelous passage concerning legacy software:
Via a million million circuitous threads of inheritance, many of the oldest programs still ran in the bowels of the Qeng Ho system. Take the Traders' method of timekeeping. The frame corrections were incredibly complex -- and down at the very bottom of it was a little program that ran a counter. Second by second, the Qeng Ho counted time from the instant that a human had first set foot on Old Earth's moon. But if you looked at it still more closely... the starting instant was actually about fifteen million seconds later, the 0-second of one of Humankind's first computer operating systems.
Should not make assumptions about other cultures, as they may be upset. Now, would more likely have been 1.5 billion seconds later:

$ TZ=UTC date -d 2019-01-03\ 02:26 +%s   # Chang'e 4 landing
1546482360

-- 
Take care. Thanks, Brian Inglis, Calgary, Alberta, Canada

This email may be disturbing to some readers as it contains too much technical detail. Reader discretion is advised.
On 2019-12-15 16:52, Brian Inglis wrote:
Should not make assumptions about other cultures, as they may be upset. [Upset meaning rendered incorrect rather than feeling annoyed.] Now, would more likely have been 1.5 billion seconds later: $ TZ=UTC date -d 2019-01-03\ 02:26 +%s # Chang'e 4 landing 1546482360
-- Take care. Thanks, Brian Inglis, Calgary, Alberta, Canada This email may be disturbing to some readers as it contains too much technical detail. Reader discretion is advised.
On 2019-12-15 00:04, Robert Elz wrote:
It makes no sense (to me) at all to convert TAI into any kind of (currently used) calendar type measurements - the calendars we use are designed to match astronomical reality (the various rotations of the earth) and those things just don't take constant periods, even though they're close. But that's what we have decided we want from our calendar system - we want it to always be summer in January, and winter in July (those unfortunate enough to be in the wrong hemisphere can adapt as needed), and TAI (alone, without some kinds of corrections) simply does not provide that. If you want some kind of calendar for TAI you'd be better to convert it to stardates or something.
The notation of values of TAI using the Gregorian calendar is helpful when comparing time scales. For instance, in the recent BIPM definition of TAI, as sanctioned by the 26th CGPM 2018, online at [https://www.bipm.org/en/CGPM/db/26/2/], we read: "TT - TAI = 32.184 s exactly at 1 January 1977, 0h TAI at the geocentre, in order to ensure continuity of TT with Ephemeris Time"
.... What we really need to do is recognise that there is not (rather should not be) just one definition (one unit) "a second" - there are two entirely different things we use commonly, which just happen to be so close together that we have tried to force them to be the same thing.
One is the thing TAI uses (and UTC with leap second corrections) - which is a period defined by something that we believe to be fixed - always the exact same duration, and so can be used for comparing durations from measurements collected separately,
The other is 1/86400 of a day (a bizarre choice, but never mind) where a day is defined according to the (varying) rotations of the earth, and hence the second is not a fixed length, but a variable one. That's the POSIX second (time_t) value, but is also how humans have measured time ever since we started doing it - initially it wasn't understood that the length of a day wasn't constant (as it so nearly is), but 60 seconds a minute, 60 minutes an hour, 24 hours a day (ie: 86400 seconds a day) is what has been used for a long time now, with the more precise measurements appearing as we gained the ability to make them: sundials to be able to measure hours, hourglasses to measure minutes, and sometimes fractions thereof, ...
We really should be using two different names for these two different things, rather than trying to pretend that a "second" can be defined which suits both purposes.
I disagree. You are right that the time scales UT1 and TAI have different rates; and there are several other time scales used in astronomy (e.g. TCB, TCG, TDB) with still other rates. This is caused by physics (not just by the tradition of mean solar time derived from Earth rotation).

Astronomers want to be able to compare these time scales and convert between them, hence they _must_ express their values in common units. Even the mere expression that the time scales differ in rate requires the use of common units. For instance, the differential quotient d(TAI - UT1)/d(UT1) is a measure of the speed of rotation of the Earth called "excess length of day" (it actually was negative, -20 µs/d, on 2019-12-11). We cannot compute TAI - UT1 unless the values of TAI and UT1 are expressed in the same units.

Still another non-SI time unit would only confuse the issue. Beyond that, the quantity "mean solar second" = d(TAI)/d(UT1)·(1 s) would be a bad choice for a unit since d(TAI)/d(UT1) changes unpredictably over time.

In the presence of different time scales, it certainly is necessary to designate the time scale in which a specific datetime was observed; in ISO 8601 notation, the suffix Z indicates a UTC value, a suffixed offset indicates a local civil time scale derived from UTC; and otherwise, an appended time scale acronym (as above) is often used to indicate the time scale.

Michael Deckers.
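The differential quotient above in miniature (the two TAI - UT1 samples are invented, chosen only to reproduce the -20 µs/d magnitude quoted in the message):

```python
# Two hypothetical daily samples of TAI - UT1, in seconds, one UT1 day apart.
tai_minus_ut1_day0 = 37.1234560
tai_minus_ut1_day1 = 37.1234360

# d(TAI - UT1)/d(UT1), approximated by a one-day difference quotient:
# the "excess length of day", here in seconds per (UT1) day.
excess_lod = tai_minus_ut1_day1 - tai_minus_ut1_day0

# Negative (about -20 microseconds per day): the Earth was rotating
# slightly fast relative to TAI, as in the 2019-12-11 figure quoted.
```

Note that the subtraction only makes sense because both samples are expressed in the same unit, which is the point being argued.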
On Dec 15, 2019, at 5:22 AM, Michael H Deckers <michael.h.deckers@googlemail.com> wrote:
The notation of values of TAI using the Gregorian calendar is helpful when comparing time scales.
So how is that defined? Do you just take a UTC value for the same instant, add the current TAI - UTC delta to it - and, for overflow (meaning "resulting seconds > 59 or minutes > 59 or...), "carry into" the calendar date, so that an event that took place at the end of 2018, with a UTC label, took place at the beginning of 2019, with a TAI label?
On 2019-12-16 03:57, Guy Harris wrote:
On Dec 15, 2019, at 5:22 AM, Michael H Deckers <michael.h.deckers@googlemail.com> wrote:
The notation of values of TAI using the Gregorian calendar is helpful when comparing time scales.
So how is that defined?
Do you just take a UTC value for the same instant, add the current TAI - UTC delta to it - and, for overflow (meaning "resulting seconds > 59 or minutes > 59 or...), "carry into" the calendar date, so that an event that took place at the end of 2018, with a UTC label, took place at the beginning of 2019, with a TAI label?
Yes. When TAI read 1977-01-01T00:00:00, TAI - UTC was 15 s, so that UTC read 1977-01-01T00:00:00 - 15 s = 1976-12-31T23:59:45.

A calendar date just denotes a point on the time axis; and a time scale assigns a point on the time axis to each point of a region of spacetime (on which the time scale is defined). That is a "time scale" in the astronomical sense of the word, where TCB, TCG, TDB, TT, UTC, UT1, UT2, and local civil times are time scales. A "geological time scale" is something else; IEC 60050 has additional meanings for "time scale".

A calendar is not restricted to the notation of values of one specific time scale, and points on the time axis can also be denoted by other means: the notations

    2000-01-01
    JD 2451 544.5
    MJD 40 587 + 946 684 800 s
    -50 a B.P.

all denote the same point on the time axis.

Michael Deckers.
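The equivalences in that list of notations are easy to check mechanically (the offsets 2400000.5 and 40587 are the standard JD-to-MJD and MJD-of-1970-01-01 constants; the helper names are mine):

```python
def jd_to_mjd(jd):
    # Modified Julian Date: MJD = JD - 2400000.5.
    return jd - 2400000.5

def mjd_to_posix_like_seconds(mjd):
    # MJD 40587 is 1970-01-01; with uniform 86400 s days this yields the
    # leap-second-free seconds count since that date.
    return (mjd - 40587) * 86400

# JD 2451 544.5 -> MJD 51 544 -> 946 684 800 s after MJD 40 587,
# i.e. the same instant as 2000-01-01; and -50 a B.P. is 50 years after
# the B.P. (Before Present) epoch of 1950, which is again the year 2000.
```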
Guy Harris said:
The notation of values of TAI using the Gregorian calendar is helpful when comparing time scales.
So how is that defined?
Do you just take a UTC value for the same instant, add the current TAI - UTC delta to it - and, for overflow (meaning "resulting seconds > 59 or minutes > 59 or...), "carry into" the calendar date, so that an event that took place at the end of 2018, with a UTC label, took place at the beginning of 2019, with a TAI label?
No. You take a Gregorian calendar (by default; in principle you could use Julian or something else) and use it to count *all* SI seconds, so that every day has exactly 86400 seconds in it. It works the same way as UT1, but without the unpredictable-length "seconds".

You can convert that to UTC by subtracting the relevant (not "current") TAI-UTC delta and doing correct carries, remembering to handle the edge cases when the delta changes.

-- 
Clive D.W. Feather          | If you lie to the compiler,
Email: clive@davros.org     | it will get its revenge.
Web: http://www.davros.org  |   - Henry Spencer
Mobile: +44 7973 377646
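A sketch of that recipe, leaning on datetime for the carries (toy code: the caller must supply the delta that applies at the instant in question, and since datetime has no 23:59:60, a conversion that should land *on* a leap second can't be represented - one of the edge cases mentioned above):

```python
import datetime

def tai_to_utc(tai_dt, tai_minus_utc):
    # Subtract the applicable TAI-UTC delta; timedelta arithmetic does
    # the borrows across minute/hour/day/month/year boundaries for us.
    return tai_dt - datetime.timedelta(seconds=tai_minus_utc)

# The 1977 example from earlier in the thread: TAI 1977-01-01 00:00:00
# with a delta of 15 s lands in the previous UTC year, 1976-12-31 23:59:45.
utc = tai_to_utc(datetime.datetime(1977, 1, 1), 15)
```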
On Mon 2019-12-16T14:10:21+0000 Clive D.W. Feather hath writ:
You take a Gregorian calendar (by default; in principle you could use Julian or something else) and use it to count *all* SI seconds, so that every day has exactly 86400 seconds in it. It works the same way as UT1, but without the unpredictable-length "seconds".
In 1968 between a pair of meetings the atomic clock and CCIR folks ascertained that they could not use that scheme in broadcast time signals, and that is when and why the notion of the leap second came into being. -- Steve Allen <sla@ucolick.org> WGS-84 (GPS) UCO/Lick Observatory--ISB 260 Natural Sciences II, Room 165 Lat +36.99855 1156 High Street Voice: +1 831 459 3046 Lng -122.06015 Santa Cruz, CA 95064 https://www.ucolick.org/~sla/ Hgt +250 m
Steve Allen said:
On Mon 2019-12-16T14:10:21+0000 Clive D.W. Feather hath writ:
You take a Gregorian calendar (by default; in principle you could use Julian or something else) and use it to count *all* SI seconds, so that every day has exactly 86400 seconds in it. It works the same way as UT1, but without the unpredictable-length "seconds".
In 1968 between a pair of meetings the atomic clock and CCIR folks ascertained that they could not use that scheme in broadcast time signals,
*because* it drifts away from UT1, right? I'm not disputing that; I'm simply saying that it works in the same way, just using real SI seconds rather than sidereal seconds.

-- 
Clive D.W. Feather          | If you lie to the compiler,
Email: clive@davros.org     | it will get its revenge.
Web: http://www.davros.org  |   - Henry Spencer
Mobile: +44 7973 377646
On 12/14/19 4:04 PM, Robert Elz wrote:
It makes no sense (to me) at all to convert TAI into any kind of (currently used) calendar type measurements - the calendars we use are designed to match astronomical reality
Well, sort of. Astronomers normally match astronomical reality only by using the technology of 45 BC, because they define a "year" to be the Julian year (31.5576 Ms) when, for example, they estimate the Big Bang as being 13.798 ± 0.037 billion years ago. Paleontologists are similar. Archaeologists too, I expect (at least for anything more than a few centuries old). For the Big Bang estimate, the difference between Julian and Gregorian is less than the confidence interval, so the distinction doesn't matter much. It's like when someone says "a million years ago": they don't much care whether this means a million years before 2019 or a million years before 2020.

If we wanted to stick to "astronomical reality" it would not be a good idea to count Earth days, as the length of the Earth day was only four hours or so back when the Earth was formed (at least, according to estimates by Takanori Sasaki <http://www.iea.usp.br/en/news/when-a-day-lasted-only-four-hours>). We'd be better off counting Earth years, as the length of Earth's year hasn't changed as much since then. Of course even then our clocks would go back only 4.5 billion years, and would stray a bit from astronomical years for other relatively-minor reasons (such as, the Sun's mass is constantly declining slightly, which means the Earth year length is constantly growing slightly).
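A back-of-envelope check of the "less than the confidence interval" claim (the constants are the standard Julian and mean Gregorian year lengths; everything else is arithmetic on the figures in the message):

```python
JULIAN_YEAR    = 365.25   * 86400   # 31.5576 Ms, as quoted above
GREGORIAN_YEAR = 365.2425 * 86400   # mean Gregorian year

big_bang_years  = 13.798e9
error_bar_years = 0.037e9           # i.e. +/- 37 million years

# How far apart do the two year definitions drift over the age of the
# universe? About 0.28 million years -- comfortably inside the error bar.
drift_seconds = big_bang_years * (JULIAN_YEAR - GREGORIAN_YEAR)
drift_years   = drift_seconds / JULIAN_YEAR
```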
Guy Harris said:
So are A-style representations defined for both UTC and TAI? ITU-R Recommendation TF.460-6 says:
2 Leap-seconds
2.1 A positive or negative leap-second should be the last second of a UTC month, but first preference should be given to the end of December and June, and second preference to the end of March and September.
2.2 A positive leap-second begins at 23h 59m 60s and ends at 0h 0m 0s of the first day of the following month. In the case of a negative leap-second, 23h 59m 58s will be followed one second later by 0h 0m 0s of the first day of the following month (see Annex 3).
but 1) where is the form HHh MMm SSs for time stamps specified for UTC, 2) is that also specified for TAI?, and 3) is it stated anywhere how days are indicated in this context (YYYY-MM-DD? DD MMMMM YYYY?)
Hmm.
What that Recommendation says is
B International atomic time (TAI)
The international reference scale of atomic time (TAI), based on the second (SI), as realized on the rotating geoid, is formed by the BIPM on the basis of clock data supplied by cooperating establishments. It is in the form of a continuous scale, e.g. in days, hours, minutes and seconds from the origin 1 January 1958 (adopted by the CGPM 1971).
C Coordinated universal time (UTC)
UTC is the time-scale maintained by the BIPM, with assistance from the IERS, which forms the basis of a coordinated dissemination of standard frequencies and time signals. It corresponds exactly in rate with TAI but differs from it by an integer number of seconds.
The UTC scale is adjusted by the insertion or deletion of seconds (positive or negative leap-seconds) to ensure approximate agreement with UT1.
which, at most, says that, *for example*, TAI is in the form of days (presumably days that are always exactly 86400 seconds long), hours, minutes, and seconds from "the origin 1 January 1958" (does that mean "midnight, 1 January 1958"?).
That says rather less than I expected. Equally, it doesn't say that much about UTC either. I would have thought "differs from it by an integer number of seconds" only makes sense if they use the same representation when calculating the difference.
If, in fact, "days" correspond to exactly-86400-second-long days,
I think they do.
and "the origin 1 January 1958" is "midnight, 1 January 1958", TAI could also be represented as a scale of seconds since the origin 1 January 1958, with conversion between those two scales being straightforward (division and remainder to go from the second scale to the first, multiplication and addition to go from the first scale to the second, given that there are no leap seconds, month lengths, or leap years to worry about).
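The "division and remainder" / "multiplication and addition" conversion described above can be sketched as follows (a minimal illustration, assuming every day is exactly 86400 seconds, with no leap seconds, month lengths, or leap years involved):

```python
# Converting between a (days, hours, minutes, seconds)-from-origin scale
# and a plain count of seconds from the same origin, under the assumption
# that every day is exactly 86400 seconds long.

def to_seconds(days, hours, minutes, seconds):
    """Multiplication and addition: (d, h, m, s) -> seconds since origin."""
    return ((days * 24 + hours) * 60 + minutes) * 60 + seconds

def from_seconds(total):
    """Division and remainder: seconds since origin -> (d, h, m, s)."""
    minutes, seconds = divmod(total, 60)
    hours, minutes = divmod(minutes, 60)
    days, hours = divmod(hours, 24)
    return days, hours, minutes, seconds

# Round trip: one day, one hour, one minute, one second past the origin.
assert to_seconds(1, 1, 1, 1) == 90061
assert from_seconds(90061) == (1, 1, 1, 1)
```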
Okay, using a JD-style day count. But ...
Can TAI also be represented as an absolute Gregorian-style date (year, month of year, day of month) and a time of hours, minutes, and seconds since the beginning of that day? If so, are there any specified rules for that? (Conversion to or from that scale from either of the scales mentioned in the previous paragraph is a little more work, as you'd have to worry about month lengths and leap years.)
I think that was probably ignored as an issue, since "everyone knows" how the calendar works. The ISO 8601 rules for date sequences work perfectly well and there's no reason to re-state them.
And in what ways could the UTC scale be represented? The "differs from it by an integral number of seconds", with the difference varying over time, would presumably be a difference between the "absolute Gregorian-style date (year, month of year, day of month) and a time of hours, minutes, and seconds since the beginning of that day" representations of TAI or UTC, given that "It corresponds exactly in rate with TAI" means that, for the "days, hours, minutes and seconds from the origin 1 January 1958" definition of TAI,
Okay so far.
every one of those seconds occurs in UTC as well,
But some of them with an xx:xx:60 label (and, in theory, some xx:xx:59 labels missing).
so if you represent UTC as a count of days, hours, minutes, and seconds from some origin, or just as a count of seconds from that origin, the difference between the two scales, in seconds, is the difference between the origins, in seconds.
I've always read it as the difference being the difference in broken-down time labels (xx:xx:05 v xx:xx:41, or whatever) and *not* a difference in seconds since some epoch.
Annex 3 of that recommendation, "Dating of events in the vicinity of a leap-second", shows events being dated "30 June, 23h 59m 60.6s UTC"; as it's an example, the year is omitted, so presumably the "30 June" would have a year number after it for a real event, i.e. an absolute Gregorian-style date.
More evidence that they viewed the calendar bit was "well-known".
Me, I would say that (1) and (4) are completely compatible -- they're two different representations for the same underlying time scale. You can perfectly interconvert between them using the same algorithm Posix defines for time_t.
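The interconversion claimed above can be sketched concretely. This is an illustration, not code from any spec: Python's datetime arithmetic also assumes every day is exactly 86400 seconds, so it models the POSIX-style algorithm directly; the 1958-01-01 origin is the one from the TF.460 text quoted earlier.

```python
# A TAI-style calendar+clock stamp (flavor 1) and a count of seconds since
# an epoch (flavor 4) interconvert losslessly, because both assume a
# Gregorian calendar made entirely of 86400-second days.
from datetime import datetime, timedelta

TAI_ORIGIN = datetime(1958, 1, 1)

def tai_stamp_to_count(y, mo, d, h, mi, s):
    """TAI-style calendar+clock -> seconds since the 1958 origin."""
    return int((datetime(y, mo, d, h, mi, s) - TAI_ORIGIN).total_seconds())

def count_to_tai_stamp(count):
    """Seconds since the 1958 origin -> TAI-style calendar+clock."""
    t = TAI_ORIGIN + timedelta(seconds=count)
    return (t.year, t.month, t.day, t.hour, t.minute, t.second)

# A non-leap year is 365 * 86400 = 31,536,000 seconds; 1958 was one, so
# exactly one such year after the origin lands on 1959-01-01 00:00:00.
assert count_to_tai_stamp(31_536_000) == (1959, 1, 1, 0, 0, 0)
assert tai_stamp_to_count(1959, 1, 1, 0, 0, 0) == 31_536_000
```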
When it comes to UTC, however, (2) and (3) are most definitely not compatible. Personally, I believe that (2) is the *only* way to represent UTC. I believe -- Posix time_t most assuredly notwithstanding -- that (3) is an abomination.
(2) and (3) are both problematic.
With (2), time stamps for events defined to occur at a specified sufficiently large number of seconds in the future cannot be predicted. That includes all events with times specified by a (1)-style or (4)-style time stamp if those times go past what the IERS has specified.
Of course. This is the basic "leap seconds problem".
With (3), some time stamps cannot be specified - including time stamps for events in the *past*, such as an event occurring on 30 June 2015 at 23h 59m 60s UTC.
And that's the Posix problem. [...]
So if we want to describe *any* timekeeping scheme that represents times in a format that is, or can trivially be converted to and from without loss of information, a count of seconds, including leap seconds, since some epoch, as "TAI", OK. That does *not*, of course, mean that the epoch needs to be expressed as a TAI time stamp; "seconds, including leap seconds, since 1970-01-01 00:00:00 UTC" would then be a representation of TAI (given that we know what instant of time was labeled 1970-01-01 00:00:00 UTC - no knowledge of the future is needed there).
Okay; I don't dispute that either. Bluetooth Mesh used a TAI 00:00:00 epoch to simplify the conversion between seconds-since-epoch and hh:mm:ss in people's minds. I accept that the only difference between that and a UTC 00:00:00 epoch is a constant offset equal to TAI-UTC at the moment of the epoch.

--
Clive D.W. Feather          | If you lie to the compiler,
Email: clive@davros.org     | it will get its revenge.
Web: http://www.davros.org  |   - Henry Spencer
Mobile: +44 7973 377646
On 2019-12-14 12:23, Guy Harris wrote:
On Dec 14, 2019, at 8:00 AM, Steve Summit wrote:
Guy Harris wrote:
On Dec 14, 2019, at 1:23 AM, Clive D.W. Feather wrote:
Guy Harris said:
ITU-R Recommendation TF.460-6 says:
2 Leap-seconds
2.1 A positive or negative leap-second should be the last second of a UTC month, but first preference should be given to the end of December and June, and second preference to the end of March and September.
2.2 A positive leap-second begins at 23h 59m 60s and ends at 0h 0m 0s of the first day of the following month. In the case of a negative leap-second, 23h 59m 58s will be followed one second later by 0h 0m 0s of the first day of the following month (see Annex 3).
but 1) where is the form HHh Mmm SSs for time stamps specified for UTC, 2) is that also specified for TAI, and 3) is it stated anywhere how days are indicated in this context (YYYY-MM-DD? DD MMMMM YYYY?)
What that Recommendation says is
B International atomic time (TAI)
The international reference scale of atomic time (TAI), based on the second (SI), as realized on the rotating geoid, is formed by the BIPM on the basis of clock data supplied by cooperating establishments. It is in the form of a continuous scale, e.g. in days, hours, minutes and seconds from the origin 1 January 1958 (adopted by the CGPM 1971).
C Coordinated universal time (UTC)
UTC is the time-scale maintained by the BIPM, with assistance from the IERS, which forms the basis of a coordinated dissemination of standard frequencies and time signals. It corresponds exactly in rate with TAI but differs from it by an integer number of seconds.
The UTC scale is adjusted by the insertion or deletion of seconds (positive or negative leap-seconds) to ensure approximate agreement with UT1.
which, at most, says that, *for example*, TAI is in the form of days (presumably days that are always exactly 86400 seconds long), hours, minutes, and seconds from "the origin 1 January 1958" (does that mean "midnight, 1 January 1958"?). If, in fact, "days" correspond to exactly-86400-second-long days, and "the origin 1 January 1958" is "midnight, 1 January 1958", TAI could also be represented as a scale of seconds since the origin 1 January 1958, with conversion between those two scales being straightforward (division and remainder to go from the second scale to the first, multiplication and addition to go from the first scale to the second, given that there are no leap seconds, month lengths, or leap years to worry about).
Just to muddy things further, an article in "Polar Motion: Historical and Scientific Problems", ASP Conference Series, Vol. 208 (also IAU Colloquium #178), edited by Steven Dick, Dennis McCarthy, and Brian Luzum (San Francisco: ASP, 2000, ISBN 1-58381-039-0): "History of the Bureau International de l'Heure", Guinot, B., p. 181:

http://articles.adsabs.harvard.edu/pdf/2000ASPC..208..175G#page=7

states in the third paragraph:

"By convention, all integrated atomic times, at BIH and elsewhere, were set so that an event at 1958 January 1, 0h UT2, receive the same date in UT2 and atomic time scales. However, the observatories used their own values of UT2: that explains that longitude errors of a few 0.01s appear in the local independent time scales, as they are presently realized."

Reading that page and the next, it is apparent that AT, AM, A3, TA, and TAI frequencies have been tweaked and steered to maintain consistency with earlier timescales and standards as well as between organizations. See also:

https://www.ucolick.org/~sla/leapsecs/taiepoch.html

and the linked Bulletin Horaire scans, and also:

https://www.bipm.org/en/bipm-services/timescales/tai.html

"The long-term stability of TAI is assured by weighting the participating clocks. The scale unit of TAI is kept as close as possible to the SI second by using data from those national laboratories which maintain the best primary caesium standards."

So all should bear in mind that TAI is a synthetic timescale calculated and adjusted in arrears, and so it shares some of the same problems as leap seconds, but occurring at higher precisions.

--
Take care. Thanks, Brian Inglis, Calgary, Alberta, Canada
This email may be disturbing to some readers as it contains too much technical detail. Reader discretion is advised.
On 2019-12-16 19:03, Brian Inglis wrote:
Just to muddy things further, an article in "Polar Motion: Historical and Scientific Problems", ASP Conference Series, Vol. 208 (also IAU Colloquium #178), edited by Steven Dick, Dennis McCarthy, and Brian Luzum (San Francisco: ASP, 2000, ISBN 1-58381-039-0): "History of the Bureau International de l'Heure", Guinot, B., p. 181:
http://articles.adsabs.harvard.edu/pdf/2000ASPC..208..175G#page=7
Thanks for the reference!
...so all should bear in mind that TAI is a synthetic timescale calculated and adjusted in arrears, so shares some of the same problems as leap seconds but occurring at higher precisions.
"Adjusted in arrears": the definitive values of TAI are fixed about a month after the fact (fast in astronomical terms!) by Circular T. A more uniform time scale can be obtained when more than a month's future data are used; such evaluations are done by the BIPM and published yearly as estimates TT(BIPM xx) of TT.

"Same problems as leap seconds": I do not think so. That is, no astronomical observations (e.g., satellite orbits, pulsar timings) can currently reach or exceed the accuracy of the determination of TT with earth-bound clocks, and the upcoming use of optical clocks will lead to a significant increase in accuracy. Nor is there any sign of a systematic difference in rate between TAI and TT -- it would be "new physics", I guess.

Michael Deckers.
Steve Summit said:
The times discussed in 5.1.4.2 Schedule Register presumably are [...] with non-leap years being either 31,536,000, 31,536,001, or 31,536,002 seconds long
I wouldn't presume that at all. If it's doing TAI, if it's not counting leap seconds, why would you presume years of other than 31,536,000 or 31,622,400 seconds?
Those ones are in local time, so have years of varying lengths (*cough* 1751 and 1752 *cough*).
Guy has, I think, tried to carefully decouple two aspects of the question:
A. Does your time scale count leap seconds or not, and
This depends on what you mean by a leap second. And that is part of the problem.
B. Does your representation use broken-down times ("calendar + clock"), or monotonic counts of seconds since an epoch?
I agree that those issues are orthogonal. (I'm also going to ignore for now the side question of whether the epoch mentioned in "B" is defined as a UTC or TAI timestamp.)
No. They look orthogonal at first glance, but they aren't.
So when Guy said "there are at least four different flavors of time stamp", I think his flavors map onto my two aspects A and B with this little truth table:
 A    B
 no   b-d    1) TAI-style calendar+clock
 yes  b-d    2) UTC-style calendar+clock
 no   mono   4) counts of non-leap seconds since a specified epoch
 yes  mono   3) counts of seconds since a specified epoch, with the count including leap seconds
[I'm wondering if that was intended to have 3 before 4, based on the following text.]

What I would have is, rather, the following:

C: Does your time scale count every second exactly once? A "no" answer means that some seconds are not counted or are counted twice.

D: Does your time scale:
(1) use a monotonic count of seconds since some epoch,
(2) use a broken-down style with the seconds field always stepping from 59 to 00 at the end of the minute, or
(3) use a broken-down style with the seconds field stepping from 58, 59, or 60 to 00 at the end of the minute, depending on the exact date-time?

C yes, D 1 is a count of SI seconds since the epoch. I called it a count of TAI seconds to avoid people confusing it with:

C no, D 1, which is Posix time_t and which some people erroneously think is a count of UTC seconds since the epoch. (It is approximately a count of UT1 seconds since the epoch, but I don't want to get into the SI versus sidereal seconds debate.)

C yes, D 2 is TAI broken-down format.
C no, D 2 is Posix broken-down format.
C yes, D 3 is UTC broken-down format.
C no, D 3 is, I hope, never used.
Me, I would say that (1) and (4) are completely compatible -- they're two different representations for the same underlying time scale. You can perfectly interconvert between them using the same algorithm Posix defines for time_t.
That algorithm can convert between D 1 and D 2 for either C case, so long as you don't mix them up.
When it comes to UTC, however, (2) and (3) are most definitely not compatible. Personally, I believe that (2) is the *only* way to represent UTC. I believe -- Posix time_t most assuredly notwithstanding -- that (3) is an abomination.
UTC is C yes D 3.
If you try to represent UTC using a monotonic count of seconds, you have to either (a) ignore leap seconds after all (as Posix does, but which totally contradicts those words "with the count including leap seconds" in the definition of (3)), or (b) use a kludgey and non-obvious (and non-Posix) mapping between your monotonic time and broken-down calendar+time, a mapping which always requires you to have a perfect leap second table available.
Yes. I agree.
And in fact, if you try to do (b), I believe that you are *not* really doing UTC, after all -- you are basically doing TAI (perhaps with a slightly different epoch), and misleadingly calling it UTC. You are engendering just as much confusion as the Posix time_t definition does. A monotonic count of seconds is simply not an appropriate representation for UTC. The only proper representation(s) of UTC are those that separate the date from the time, so that they can cleanly, honestly, openly, and explicitly accommodate those pesky days of other than 86400 seconds.
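The "kludgey mapping" of option (b) can be made concrete. This is a toy sketch, not any standard's algorithm; the epoch (2015-06-30 23:59:00 UTC, one minute before the real leap second at the end of 2015-06-30) and the tiny one-entry leap table are chosen purely for the demo, but they show why the mapping always needs a leap-second table and why the :60 label cannot come out of ordinary calendar arithmetic:

```python
# Mapping a monotonic count of seconds (leap seconds included) to a UTC
# broken-down time, which requires knowing where every leap second falls.
from datetime import datetime, timedelta

EPOCH = datetime(2015, 6, 30, 23, 59, 0)  # demo epoch, in UTC
# Counts (on the leap-inclusive scale) at which a positive leap second
# occurs; only the 2015-06-30 23:59:60 entry matters for this demo.
LEAP_COUNTS = [60]

def count_to_utc(count):
    """Leap-inclusive count since EPOCH -> UTC (y, mo, d, h, mi, s)."""
    if count in LEAP_COUNTS:
        # The leap second itself: label it :60, a stamp that plain
        # 86400-second-day calendar arithmetic can never produce.
        t = EPOCH + timedelta(seconds=count - 1)
        return (t.year, t.month, t.day, t.hour, t.minute, 60)
    # Otherwise drop the leap seconds already inserted, then use ordinary
    # (86400-second-day) calendar arithmetic.
    skipped = sum(1 for c in LEAP_COUNTS if c < count)
    t = EPOCH + timedelta(seconds=count - skipped)
    return (t.year, t.month, t.day, t.hour, t.minute, t.second)

assert count_to_utc(59) == (2015, 6, 30, 23, 59, 59)
assert count_to_utc(60) == (2015, 6, 30, 23, 59, 60)  # the leap second
assert count_to_utc(61) == (2015, 7, 1, 0, 0, 0)
```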
I think I agree with that as well.
Guy Harris said:
On Dec 14, 2019, at 1:23 AM, Clive D.W. Feather <clive@davros.org> wrote:
Guy Harris said:
[1] For example, I've ensured that the Bluetooth Mesh standard uses TAI internally for all timestamps, leaving the leap second mess to the periphery.
Yes, section 5.1.1 "Time" of
https://www.bluetooth.org/docman/handlers/downloaddoc.ashx?doc_id=429634
It's a while since I was involved in Mesh, and it's possible that they've managed to mess things up since then. But, let's see ...
specifies that the base representation of times is the number of seconds that have elapsed since 1999-12-31T23:59:28 UTC. :-) :-) :-) :-) :-)
Yes. But ...
(I.e., the only way in which it "uses TAI" rather than using UTC is that the epoch is defined as an instant of time that has more zeroes in the way it's represented as TAI than in the way it's represented as UTC. If it had been defined as the number of seconds that have elapsed since 2000-01-01T00:00:00 UTC, that would still "leave the leap second mess to the periphery"; it would just mean that the count of seconds would be 32 seconds lower.)
Okay, that statement does indeed pick a convenient epoch for counting seconds. It means that the function to map the 40-bit seconds count (and 8-bit subsecond count) to a "broken-down" time in TAI is easy.
By the way, how is TAI, represented as YYYY-MM-DDTHH:MM:SS TAI, defined? Does it have a Gregorian-style calendar made up entirely of 86400-second days, so that most years are exactly 31,536,000 seconds long, with some years being exactly 31,622,400 seconds long?

Yes, and that's a big part of "avoiding the mess" - you don't worry about time zones *or* leap seconds.

...unless you need to convert a Bluetooth Mesh time stamp to UTC.
Which is done "on the periphery". Note the use of zero in various fields to indicate that the device doesn't have zone or leap second information.
And the same would be true if it counted the number of seconds that have elapsed since 2000-01-01T00:00:00 UTC.
Also true, but it seems cleaner to have the epoch be a round point in TAI, rather than in UTC, given that you're using TAI internally. Note the following paragraph:

"To allow Mesh devices to refer to UTC or local times, devices need to be aware of the past, present, and predicted changes to the TAI-UTC Delta (the number of seconds between TAI and UTC) and to the local time zone offset (e.g., in Seattle, USA, the local time is exactly 7 hours behind UTC for part of the year and 8 hours behind UTC for the rest of the year). Because these two values can change at any time for physical or political reasons, they are not hard-coded into this specification. Instead, they are communicated to all nodes in the mesh provided that at least one device has the information."

More "periphery" stuff, since most uses won't use those offsets. An algorithm is also given for converting TAI seconds since the epoch to a UTC "broken-down" time. Setting E and F to zero will give you a TAI "broken-down" time.
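The arithmetic behind the epoch choice being discussed is just a constant offset. A small sketch (the function name is mine, purely for illustration): TAI-UTC was 32 seconds at the start of 2000, so the Mesh epoch 1999-12-31T23:59:28 UTC is the same instant as 2000-01-01T00:00:00 TAI, and a count from a hypothetical 2000-01-01T00:00:00 UTC epoch would be 32 lower for every time stamp.

```python
# TAI-UTC delta at 2000-01-01, from the published leap-second history.
TAI_MINUS_UTC_2000 = 32

def mesh_count_from_utc2000_count(n):
    """Convert a seconds count since 2000-01-01T00:00:00 UTC to a count
    since 1999-12-31T23:59:28 UTC (== 2000-01-01T00:00:00 TAI).
    Both counts tick the same seconds; only the origin differs."""
    return n + TAI_MINUS_UTC_2000

# The instant labeled 2000-01-01T00:00:00 UTC is 32 seconds after the
# Mesh epoch, so its Mesh count is 32.
assert mesh_count_from_utc2000_count(0) == 32
```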
1) "TAI-style calendar+clock", with a time stamp being a Gregorian-style YYYY-MM-DDTHH:MM:SS, with non-leap years being exactly 31,536,000 seconds long and leap years being exactly 31,622,400 seconds long, so that SS goes from 00 to 59, never to 60;
That's what Mesh uses when handling a "broken-down" time. Hence "TAI".
Preview doesn't report the word "broken" anywhere in the spec.
The term "broken-down time" is common - or, at least, I thought it was - to refer to the "YYYY-MM-DD hh:mm:ss" format as opposed to "(some) (sort-of) seconds past the epoch". But it's possible that they stopped doing this after I stopped being involved.
The times discussed in 5.1.4.2 Schedule Register presumably are, when no wildcards are involved, local dates/times, with a time stamp being a Gregorian-style YYYY-MM-DDTHH:MM:SS, and with non-leap years being either 31,536,000, 31,536,001, or 31,536,002 seconds long, and with leap years being either 31,622,400, 31,622,401, or 31,622,402 seconds long (which might disagree with most real-world clocks, as those clocks probably behave as if you had a UTC calendar and a clock that ticks like TAI, in the sense that it always goes from 23:59:59 to 00:00:00, but that's some number of seconds off from TAI).
The Scheduler is part of the app you run on your phone, or whatever. As it says, the fields are in local time.
The Time state includes TAI-UTC Delta Current, TAI-UTC Delta New, and TAI of Delta Change; those require dealing with leap seconds.
But those are only generated by the device that knows them and are only used to present values in local time or to convert local time to SI seconds past the epoch. Most devices in the mesh don't do that. The Scheduler app converts the local times into TAI-past-the-epoch to send out "turn on at this time" messages.
3) counts of seconds since a specified epoch, with the count including leap seconds;
And that's what Mesh uses when just counting seconds. I agree that it doesn't matter whether you could TAI seconds or UTC seconds, because they're exactly the same.
s/could/call them/ In fact, "SI seconds" might have been better. One reason for using "TAI" rather than "UTC" is that it makes it clear that all seconds are included rather than the Posix perversion of calling it UTC and then ignoring some of them.
(The point is that, in a time stamping mechanism that stamps times with counts of seconds since an epoch, whether the count includes leap seconds is independent of whether the specification for the epoch includes the letters "T", "A", and "I" in sequence with no spaces between them or includes the letters "U", "T", and "C" in sequence with no spaces between them.)
True, *if* people don't misunderstand.