comments on draft-newman-datetime-00.txt
I liked Markus Kuhn's comments on your Internet Draft ``Date and Time on the Internet'' <URL:ftp://ds.internic.net/internet-drafts/draft-newman-datetime-00.txt> (December 1996). I have the following further comments and suggestions, some of them echoing Kuhn, some providing more details. Please let me know if you have any questions about these comments, and I'd appreciate hearing about any updates to your draft.

Section 2

This section should mention that all dates use the Gregorian calendar.

Section 3

Section 3 suggests that 00-49 should map to 2000-2049, and that 50-99 should map to 1950-1999. This contradicts existing practice in one important set of implementations: the XPG4 standard says that 00-68 should map to 2000-2068, and that 69-99 should map to 1969-1999. This accommodates nonnegative Posix timestamps, which start at 1970-01-01 00:00:00 UTC (equivalent to 1969-12-31 in some time zones).

Any suggestion for mapping 2-digit to 4-digit dates is bound to be controversial. Perhaps it would be best to remove this recommendation entirely. If you must suggest a range, the XPG4 range is a bit better than the range in the December draft, as it matches existing practice better (and it will give you another 19 years to upgrade your software...).

The recommendation ``Three digit years MUST be interpreted by adding 1900'' seems to be designed only for interfacing to buggy software (e.g. software that uses C's tm_year value without changing it). It is also incompatible with ISO 8601, which uses 3-digit strings for day-of-year. I suggest removing this recommendation.

Section 3 mentions ``two digit'' years and ``three digit'' years without making it clear that we are talking about lexical syntax, not numeric value. This should be clarified. For example, the string `0096' should denote the year 96, not 1996; otherwise there would be no way to specify the year 96.
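For concreteness, the XPG4 mapping described above can be sketched in C; the helper name is hypothetical, not taken from XPG4 or the draft:

```c
/* Map a two-digit year (0-99) to a full year using the XPG4 rule:
   00-68 -> 2000-2068, 69-99 -> 1969-1999.
   (The December draft instead proposed 00-49/50-99.) */
int xpg4_full_year(int yy)
{
    return yy + (yy <= 68 ? 2000 : 1900);
}
```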
Section 4.1

A good reference for local time zone rules is the tz database maintained by Olson et al. (<URL:ftp://elsie.nci.nih.gov/pub/>, updated regularly).

Section 4.2

   When the local offset is unknown, the offset "-00:00" MAY be used to
   indicate that the time is in UTC and the local offset is unknown.

This is worded a little confusingly -- could you please clarify? Is it common to have situations where UTC is known but local time isn't? Without more motivation, it's hard to see why this suggestion is needed.

Section 4.3

   An alternative would be to show a list of the timezone labels defined
   in [section XXX].

I'm not sure what is meant here. Surely by ``timezone label'' you do not mean commonly used strings like `EST', since such strings are ambiguous in practice. E.g. `EST' has one meaning in the US, another in Brazil, and yet another in Australia (where the meaning of `EST' also depends on whether it is winter or summer!). Also, there is controversy about what the time zone labels ought to be -- e.g. should Canadian Eastern Standard Time be called `EST' (English) or `HNE' (French)?

If by ``timezone label'' you mean some other identifying label, then I suggest using the Posix TZ string as extended by the Olson package (again, see <URL:ftp://elsie.nci.nih.gov/pub/>). This is common practice, is widely used, and there are multiple implementations; e.g. the source code to another implementation of the client library for the Olson TZ extensions can be found in <URL:ftp://prep.ai.mit.edu/pub/gnu/glibc-1.09.1.tar.gz>. For example, the Olson TZ string `America/Los_Angeles' stands for local time in Los Angeles; the Posix TZ string `CST-8' stands for the time zone named `CST' that is always 8 hours ahead of UTC.
The advantage of `America/Los_Angeles' is that it can work correctly for all times in the past (the tz database has entries back to 1883, when Los Angeles adopted standard time), and it requires no user modification to work correctly in the future, as the tables can be updated automatically by administrators as needed.

Section 5.4

time-hour's range should be 00-23, not 00-24. ISO 8601 allows 24, but people are sometimes confused by it, it makes sorting a bit trickier, and it provides no useful functionality in this context.

time-second allows leap seconds (value 60), but many systems do not support leap seconds. (E.g. Posix requires that the host not support leap seconds in time_t values.) The RFC should recommend what to do on a system that lacks leap second support when it is given a time stamp containing a leap second. I suggest that such systems treat values >= 60 as 59.999... (the number of 9s after the decimal point being the maximum allowed by the host).

time-secfrac should use "." for the decimal point, as this is more commonly used in computer applications. Usurping "," for the decimal point might cause problems in applications that use "," for other punctuation. ISO 8601 allows "." for the decimal point.

time-numzone should not require specification of minutes (and seconds), as they are normally zero. E.g. change it to:

   time-numzone = ("+" / "-") time-hour [":" time-minute]

date-time should not require a "T" between the date and the time; it should allow a space as well. This is easier to read, is allowed by ISO 8601, and is common practice.

ISO 8601 provides no way to represent years before the year 0000, or after the year 9999. This makes it difficult to represent timestamps in some historical applications. To fix this, you might extend the syntax for date-fullyear to:

   date-fullyear = ["-"] 4*DIGIT

where the years are numbered ..., -0002, -0001, 0000, 0001, 0002, ....
(Note that unlike the traditional Julian calendar, there is a year 0 in the modern Gregorian calendar.)

Section 6

I don't know of any ``IANA Registry of Timezone Names''. But please see my comments on Section 4.3 above for more details about an existing registry that could be used as the basis of an IANA registry.

Appendix A

Why is this section needed?

Appendix B

I don't see why this section is needed, since this draft RFC doesn't care about the day of the week. But if you think it's needed, here's the canonical reference for Zeller's congruence (written in German):

   Chr. Zeller, Kalender-Formeln, Acta Mathematica, Vol. 9, Nov. 1886

And here is code that is derived directly from Zeller's paper and uses Zeller's notation.

   #include <stdio.h>

   /* Return day of week, with 0 meaning Saturday and 1 meaning Sunday.
      See Chr. Zeller, Kalender-Formeln, Acta Mathematica, Vol. 9,
      Nov. 1886.  */
   int
   zeller (int year, int month, int day)
   {
     int jan_or_feb = month < 3;
     int y = year - jan_or_feb;
     int J = y / 100;                 /* century number */
     int K = y % 100;                 /* year number within the century */
     int m = month + 12 * jan_or_feb; /* month number (3-14) */
     int q = day;                     /* day number in the month */
     int h = (q + ((m + 1) * 26) / 10 + K + K/4 + J/4 + 5*J) % 7;
                                      /* weekday number (1 is Sunday);
                                         + 5*J is congruent to Zeller's
                                         - 2*J (mod 7), but keeps the
                                         dividend nonnegative, since C's
                                         % may yield a negative result
                                         for a negative dividend */
     return h;
   }

   char *dayofweek[] = {
     "Saturday", "Sunday", "Monday", "Tuesday",
     "Wednesday", "Thursday", "Friday"
   };

   int
   main ()
   {
     int year, month, day;
     printf ("Enter the year (0001-9999): ");
     scanf ("%d", &year);
     printf ("\nEnter the month (1-12): ");
     scanf ("%d", &month);
     printf ("\nEnter the day of the month (1-31): ");
     scanf ("%d", &day);
     printf ("The day of the week is: %s\n",
             dayofweek[zeller (year, month, day)]);
     return 0;
   }
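The leap-second fallback suggested under Section 5.4 above (treat seconds >= 60 as 59.999... on hosts without leap-second support) might be sketched as follows; the helper name is hypothetical, and a fixed 9-digit fraction stands in for "the maximum allowed by the host":

```c
/* On a host without leap-second support, clamp a leap second
   (a seconds value of 60 or more) to just below 60.
   Sketch only: 59.999999999 models "as many 9s as the host allows". */
double clamp_leap_second(double seconds)
{
    return seconds >= 60.0 ? 59.999999999 : seconds;
}
```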
<<On Tue, 31 Dec 1996 15:18:33 -0800, Paul Eggert <eggert@twinsun.com> said:
When the local offset is unknown, the offset "-00:00" MAY be used to indicate that the time is in UTC and the local offset is unknown.
This is worded a little confusingly -- could you please clarify? Is it common to have situations where UTC is known but local time isn't? Without more motivation, it's hard to see why this suggestion is needed.
Certainly. If you're running NTP, for example, you know a fairly good approximation of UTC, but the local offset is still left to be manually configured. Or consider, for example, a large corporate host with users from many different parts of the world; while each user may have his own idea of the timezone, system administrators may choose to use UTC for system programs to ease administration. Programs which communicate with such a system should not assume that communications related to a particular user are dated with that user's preferred timezone.

-GAWollman

--
Garrett A. Wollman   | O Siem / We are all family / O Siem / We're all the same
wollman@lcs.mit.edu  | O Siem / The fires of freedom
Opinions not those of| Dance in the burning flame
MIT, LCS, ANA, or NSA|              - Susan Aglukark and Chad Irschick
In message <199612312318.PAA02746@shade.twinsun.com>, Paul Eggert wrote:
<URL:ftp://ds.internic.net/internet-drafts/draft-newman-datetime-00.txt>
I agree with Paul's remarks except where noted below.
Section 4.2
When the local offset is unknown, the offset "-00:00" MAY be used to indicate that the time is in UTC and the local offset is unknown.
This is worded a little confusingly -- could you please clarify? Is it common to have situations where UTC is known but local time isn't? Without more motivation, it's hard to see why this suggestion is needed.
On board a plane, in orbit, in a submarine, in any mobile device that receives time only by GPS or NTP, in my laptop that I carry with me during my next south pole tour, ... Since we can expect e.g. GSM cellular phones, LEO-satellite-based phones, etc. that include a tiny keyboard and allow you to write e-mail wherever you are currently traveling, it is not an unrealistic assumption that the device that submits your e-mail knows UTC, but not your local time.
ISO 8601 provides no way to represent years before the year 0000, or after the year 9999. This makes it difficult to represent timestamps in some historical applications. To fix this, you might extend the syntax for date-fullyear to:
date-fullyear = ["-"] 4*DIGIT
where the years are numbered ..., -0002, -0001, 0000, 0001, 0002, .... (Note that unlike the traditional Julian calendar, there is a year 0 in the modern Gregorian calendar.)
I think this is definitely outside the scope of this draft. Remember: this draft specifies a date+time format with at least second precision. This is hardly something you would ever use for historic applications, where you often want to be able to represent that you only know the date but not the time, etc. The authors of the applications you are talking about should look at the full ISO 8601 family of formats and not at draft-newman-datetime. In this draft, we have in mind timestamps for digital objects (e-mail, etc.) and not anything suitable for representing the time when some comet appeared >3000 years ago.

The same argument applies to the 2-digit year conversion discussion: birthdates will never be represented in the draft-newman-datetime format, because births are usually not recorded with second precision. (BTW: my recorded birth timestamp was 1971-01-01 10:11+01, so I really had to become a computer geek with these many 0s and 1s on the birth certificate ;-).

However, I agree that ISO 8601 really should be extended by a specification of how negative dates and dates >9999 can be represented. But this is an issue for the ISO WG and not for an RFC. There are many other things which could be added to ISO 8601; for instance, there is no distinguished notation for an MJD date and time, etc.
Appendix B
I don't see why this section is needed, since this draft RFC doesn't care about the day of the week. But if you think it's needed, here's the canonical reference for Zeller's congruence (written in German):
Chr. Zeller, Kalender-Formeln, Acta Mathematica, Vol. 9, Nov. 1886
It is probably a good idea to provide a small general list of references for calendar algorithms, as the readers of this standard are quite likely to have to implement algorithms like determining the week number, the weekday, the MJD, the difference in days between two dates, etc. One good reference might be (haven't checked it myself yet):

   Dershowitz & Reingold: Calendrical Calculations,
   Software Practice and Experience, vol 20 #9 & vol 23 #4
   <URL:http://emr.cs.uiuc.edu/~reingold/calendars.html>

Contains many algorithms in Lisp for converting between calendars, determining holidays all over the world, etc. They've also published a more detailed book version:

   Calendrical Calculations
   By Nachum Dershowitz and Edward M. Reingold.
   Cambridge University Press, 1997.
   ISBN 0-521-56413-1 (Hardback)
   ISBN 0-521-56474-3 (Paperback)

Other references, some of which might also be worth checking out and mentioning:

   Dyreson & Snodgrass: Efficient timestamp input and output,
   SWP&E, vol 24 #1, Jan 94, pp 89-109

   Larsen: Computing the Day of the Week,
   DDJ, April 1995, pp 125-126

   Meyer: Julian and Gregorian Calendars,
   DDJ, March 1993

   Ian Oliver: Programming Classics, Prentice-Hall 93,
   see chapter 3.2, pp 57-66

Credit: some of this is taken from a list of date/time algorithm references that Prof. Karl Kleine <kleine@fh-jena.de> mailed me on 1995-09-12.

Markus

--
Markus G. Kuhn, Computer Science grad student, Purdue University,
Indiana, USA -- email: kuhn@cs.purdue.edu
Date: Thu, 02 Jan 1997 16:45:07 -0500 From: kuhn@cs.purdue.edu ("Markus G. Kuhn")
Section 4.2
When the local offset is unknown, the offset "-00:00" MAY be used to indicate that the time is in UTC and the local offset is unknown.
This is worded a little confusingly -- could you please clarify? ....
   On board a plane, in orbit, in a submarine, in any mobile device that
   receives time only by GPS or NTP, in my laptop that I carry with me
   during my next south pole tour, ...

I should have stated my question more clearly. In some cases, local time may be defined but not known (e.g. a UTC-based timepiece at an unknown location); in other cases, local time itself may be undefined (an extreme example of this is the North Pole; there are other, more common, examples, e.g. aircraft flying over most of the Pacific, where there is no established convention for time zone boundaries or for whether daylight saving applies). I couldn't tell which case was intended from the original wording; I now see from other people's responses that the former was undoubtedly intended, but it might be helpful to clarify the wording here.

While we're on the subject, I think `-00:00' is a kludge and should be removed. How about the following convention instead?

   `Z' means the time is in UTC and the local UTC offset is unknown or undefined.
   `+00' means the time is in UTC and the local UTC offset is 0.

This conveys the same information as the convention proposed in section 4.2 of draft-newman-datetime-00.txt, but it's easier to explain and is more consistent.
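The convention proposed above might be parsed as in the following sketch; parse_zone and its signature are hypothetical illustrations, not code from the draft:

```c
#include <stdbool.h>
#include <stdlib.h>
#include <string.h>

/* Parse a zone suffix under the proposed convention:
   "Z"              -> time is UTC, local offset unknown or undefined
   "+hh[:mm]" etc.  -> time is UTC, local offset known.
   Returns the offset in minutes east of UTC; *offset_known is set
   to false only for "Z".  Hypothetical helper for illustration. */
int parse_zone(const char *s, bool *offset_known)
{
    if (strcmp(s, "Z") == 0) {
        *offset_known = false;
        return 0;               /* the time itself is still UTC */
    }
    int sign = (*s == '-') ? -1 : 1;
    int hours = atoi(s + 1);    /* stops at ':' if present */
    const char *colon = strchr(s, ':');
    int minutes = colon ? atoi(colon + 1) : 0;
    *offset_known = true;
    return sign * (hours * 60 + minutes);
}
```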
In message <199701022328.PAA05986@shade.twinsun.com>, Paul Eggert wrote:
While we're on the subject, I think `-00:00' is a kludge and should be removed. How about the following convention instead?
`Z' means the time is in UTC and the local UTC offset is unknown or undefined.
`+00' means the time is in UTC and the local UTC offset is 0.
This conveys the same information as the convention proposed in section 4.2 of draft-newman-datetime-00.txt, but it's easier to explain and is more consistent.
Excellent suggestion! This will make clear whether the timestamp was created in London or by a device that does not care about local time.

Markus

--
Markus G. Kuhn, Computer Science grad student, Purdue University,
Indiana, USA -- email: kuhn@cs.purdue.edu
In message <199701022328.PAA05986@shade.twinsun.com>, Paul Eggert wrote:

   > While we're on the subject, I think `-00:00' is a kludge and should be
   > removed. How about the following convention instead?
   >
   > `Z' means the time is in UTC and the local UTC offset is unknown or undefined.
   > `+00' means the time is in UTC and the local UTC offset is 0.

I'd very much prefer it if use of any kind of alphabetic codes for timezone indications were deprecated. They take special-case code to handle, and are a nuisance.

The -00:00 thing is a kludge OK, but it is one that works. Those few applications that actually want to know that the local time isn't (wasn't) really UTC can check for this and recognise it; the vast majority, which simply want to convert the time given to some standard reference (for collating, display in their local zone, etc.), can simply parse the -00:00 the same as they would 00:00 (or +00:00, whichever it is) and don't need to worry about the difference. That the reported time is in UTC is all that matters to them, not why it is. While it is a kludge, it is a clever one.

   Date: Thu, 02 Jan 1997 18:46:02 -0500
   From: kuhn@cs.purdue.edu ("Markus G. Kuhn")
   Message-ID: <199701022346.SAA01238@ector.cs.purdue.edu>

   Excellent suggestion! This will make clear whether the timestamp was
   created in London or by a device that does not care about local time.

No, the +00:00 vs -00:00 distinction can already achieve that; the 'Z' thing isn't necessary.

And while I'm here, it is all very nice to follow 8601 and all that, but if the aim of this draft is really to make a spec for reporting times that can be used on the internet, it is probably more important that the current internet time specs be examined, and needless differences be avoided. E.g.: the rfc822 (e-mail) way to report a numeric time zone is +nnnn (or -nnnn) -- no colons. There's about as much hope of that ever changing as there is of redefining time to use a much more rational 100 seconds in a minute, etc. Writing a spec that won't be used isn't very productive.

kre
In message <9150.852255095@munnari.OZ.AU>, Robert Elz wrote:
In message <199701022328.PAA05986@shade.twinsun.com>, Paul Eggert wrote:
> While we're on the subject, I think `-00:00' is a kludge and should be
> removed. How about the following convention instead?
>
> `Z' means the time is in UTC and the local UTC offset is unknown or undefined.
> `+00' means the time is in UTC and the local UTC offset is 0.
I'd very much prefer it if use of any kind of alphabetic codes for timezone indications were deprecated.
The 'Z' is not part of any deprecated code; it is the official ISO 8601 designation for times given in UTC. There are many applications where you are not interested in local time, and there, the 'Z' is a nice and short indicator that we use UTC. I do not like the idea of adding a useless and ugly -00 in protocols where only UTC will be used. But a simple 'Z' to remind the casual viewer that this is UTC won't do any harm.

Historically, the 'Z' was derived from a NATO letter time-zone code, which is considered in the computer community to be deprecated. That full code has been replaced by ISO 8601's numeric offset notation.

The lines of code to detect -00 or Z are practically identical, and a missing local time indicator is in any case a special case that has to be handled. Forcing the programmer to add one extra check creates awareness! Automatic fall-through kludges like -00 just cause programmers to forget that this is a special case that might or might not need special consideration. For human viewers, I prefer a clearly visible distinction like Z and +00 instead of some hidden coding trick like -00 and +00 that is clearly outside the ISO 8601 format.

We are discussing here just one or two lines of code. If I have enough time, I'll write some nice generation and parsing routines for this format that we can put into the annex of the draft.
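For what it's worth, the two detection tests being compared above really are the same size; a sketch with hypothetical helper names:

```c
#include <string.h>

/* The draft's convention: "-00:00" marks an unknown local offset. */
int offset_unknown_dash(const char *zone) { return strcmp(zone, "-00:00") == 0; }

/* The proposed alternative: "Z" marks an unknown local offset. */
int offset_unknown_z(const char *zone)    { return strcmp(zone, "Z") == 0; }
```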
   And while I'm here, it is all very nice to follow 8601 and all that,
   but if the aim of this draft is really to make a spec for reporting
   times that can be used on the internet, it is probably more important
   that the current internet time specs be examined, and needless
   differences be avoided. E.g.: the rfc822 (e-mail) way to report a
   numeric time zone is +nnnn (or -nnnn) -- no colons.
We go even one step further and remove the redundant minute offset digits. RFC822 has a relatively bad date/time format design. The original designers had only U.S. time zone abbreviations in mind, etc. I definitely wouldn't use it as a model for new designs. Including weekday indicators and 3-letter month names that have to be processed by a lookup table simply demonstrates that the RFC822 designers were careless here.

RFC822 is anyway a very strange and difficult-to-read standard. The fact that it is widely used and quoted (or at least the subset of it that implementors understand ... ;-) does not mean that it is an excellent design. I was quite shocked when I read the famous RFC822 document the first time, and I have read a *lot* of bad standards before ...
   There's about as much hope of that ever changing as there is of
   redefining time to use a much more rational 100 seconds in a minute,
   etc. Writing a spec that won't be used isn't very productive.
I do not agree here. We are not talking about changing RFC822. We are talking about an ASCII date/time format to be used in NEW protocol designs where there is no requirement for backward compatibility with RFC822. I claim that the RFC822 date format has been used in new protocols (e.g., HTTP) not because it is such a great design, but because it was easy to reference. Once a new and better format has been published, I am sure designers of new protocols will consider it. I had already planned to write something very similar to draft-newman-datetime, and I very much appreciate this effort!

Markus

--
Markus G. Kuhn, Computer Science grad student, Purdue University,
Indiana, USA -- email: kuhn@cs.purdue.edu
Date: Thu, 02 Jan 1997 21:46:47 -0500
From: kuhn@cs.purdue.edu ("Markus G. Kuhn")
Message-ID: <199701030246.VAA09620@ector.cs.purdue.edu>

   The 'Z' is not part of any deprecated code, it is the official ISO
   8601 designation for times given in UTC.

I understand that; I was suggesting that it be deprecated, not that it is.

   There are many applications where you are not interested in local
   time, and there, the 'Z' is a nice and short indicator that we use
   UTC. I do not like the idea to add a useless and ugly -00 in
   protocols where only UTC will be used. But a simple 'Z' to remind the
   casual viewer that this is UTC won't do any harm.

If *only* UTC will be used, I don't care, as the zone doesn't need to be examined or parsed in any case -- I'd omit the thing entirely. It is unlikely that humans (casual viewers) will ever do much looking at protocols that carry only UTC, as human factors wouldn't allow a protocol that people will see to carry only UTC times; people want the times they see to be in local time. You're also assuming that casual observers have some idea what Z might mean, which I find pretty unlikely.

   The lines of code to detect -00 or Z are practically identical,

Yes, when you care to detect this case, they are. The difference is when you don't care, and all you want is to get the time given converted (from whatever zone it was in) to either UTC or your own local time. There, there is a difference: -00 can be parsed by atoi() (or whatever) just the same as +00, without any special cases at all. On the other hand, "Z" requires a magic special test for that case first.

   and a missing local time indicator is in any case a special case that
   has to be handled.

Not at all; as long as the time and zone together are consistent, and after the two are combined the correct UTC time can be deduced, nothing else often matters. The really hard case is "this time is my local time, but I have no idea what the offset from UTC is". That's not what this discussion is about.
   We go even one step further and remove the redundant minute offset
   digits.

The minutes offset is certainly not redundant. I find it hard to believe that anyone who knows anything about time zones can believe that. In Australia right now there are two different time zones that are not even hours: +1030 and +0930. There are two only because the former (further south) has summer time (it isn't daylight saving time in Aust, it is Summer Time) and the northern section (which gets close to the equator, and well into the tropics) doesn't. There are also half-hour offsets in India, and other places.

   RFC822 has a relatively bad date/time format design.

Actually, ignoring the alpha timezone nonsense (both the 3-letter versions and the one-letter ones), I think it is fairly reasonable.

   Including weekday indicators

They are optional, harmless (you never need to parse them), and generally useful -- I don't know about you, but right now I can't think just what day 29 Dec 1996 was; I know it was a few days ago, but just when it was, or what I did that day, I have no idea. But when I look at my calendar and see it was last Sunday, then I have a much better idea just what that was (it was the day I didn't get to go to the cricket because the game ended early, the previous afternoon...). You can argue that the UA should be calculating and displaying the weekday (if desired) based upon the rest of the date, but UAs of the time (and even now) were not nearly that complex, and tended to simply show the text as it was transmitted.

   and 3-letter month names that have to be processed by a lookup table

You actually want to be thankful that they did it that way; if not, what we would have had to deal with in this message would have been 1/3/97 (or 1/3/1997), which I think you will agree with me means March 1 this year, but isn't what the people in the US believe it means... That might not need a lookup table, but it is ambiguous, and much much worse.
You certainly wouldn't have got 19970103, as that's horrid for humans to parse, and unknown in the US (and generally here as well, if that matters).

   simply demonstrates that the RFC822 designers were careless here.

I think you are wrong about that; this part I think they did as best as could be done -- for that matter, so did they in most other areas.

   RFC822 is anyway a very strange and difficult to read standard.

That's a different issue, and I mostly agree. It can be difficult to decipher. But once you do, you generally find that the actual spec is fairly reasonable (of course it helps if you understand the constraints that operated at the time -- 822 wasn't written in a vacuum, 733 existed before it, etc).

   We are talking about an ASCII date/time format to be used in NEW
   protocol designs where there is no requirement for backward
   compatibility with RFC822.

Fine.

   I claim that the RFC822 date format has been used in new protocols
   (e.g., HTTP), not because it is such a great design, but because it
   was easy to reference.

I only partly agree. It has also been fairly widely used because it is generally better to have just one way of doing things than several. That's why I doubt that a new form will be widely used. Applications could simply use Unix's "ctime" format, which would be much easier for many to generate -- they don't, not because there is anything particularly wrong with that, and not even because it is an OS-specific format (it isn't; it was just invented there), but because it is different, and there is no compelling reason to do something different when a way to do the same function already exists.

kre
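The point above about -00 parsing with no special cases holds for the whole RFC822-style numeric zone: a plain integer conversion handles the half-hour offsets too. A sketch, with a hypothetical helper name:

```c
#include <stdlib.h>

/* Convert an RFC822-style numeric zone such as "+1030" or "-0500"
   to minutes east of UTC.  A single atoi() suffices -- no colons,
   no lookup table; C's truncating division keeps the sign right
   for half-hour offsets.  Hypothetical helper for illustration. */
int rfc822_zone_minutes(const char *zone)
{
    int v = atoi(zone);              /* e.g. "+1030" -> 1030 */
    return (v / 100) * 60 + (v % 100);
}
```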
In message <9275.852269194@munnari.OZ.AU>, Robert Elz wrote:
We go even one step further and remove the redundant minute offset digits
The minutes offset is certainly not redundant. I find it hard to believe that anyone who knows anything about time zones can believe that. In Australia right now there are two different time zones that are not even hours. +1030 and +0930.
I am very well aware of 30-min offsets. You probably have not read my full proposal (sorry, perhaps some parts of the discussion didn't go to tz). Here is the proposal again, shown by examples (all show the same time):

   1996-12-31 15:08:32-05                the common case
   1997-01-01 01:38:32+05:30             very rare 30-min offsets
   1996-12-31 20:08:32Z                  UTC
   1996-12-31 15:08:32.048-05            higher precision for applications
   1996-12-31 20:08:32.05Z               with many timestamps per second
   1997-01-01 01:38:32.048123456+05:30   worst case length: 35 characters

I find these very nice, practical, and readable. For instance, GNU RCS 5.7 (with option -zLT, in the next revision hopefully by default) already does it right:

   $Id: tamper.html,v 1.8 1996-12-03 00:35:29-05 kuhn Rel $

So I am not inventing a new format here, but just suggesting the use of proven technology. Add "setenv RCSINIT -zLT" to your Unix startup scripts and enjoy the new date format with RCS 5.7 or higher.

Another note: for maximum interoperability, the number of second-fraction digits after the dot should be limited to 9 by the standard. This gives the nanosecond precision offered by the POSIX 1003.1b timer library functions, and I doubt anyone will ever need higher timing precision here. Just a guideline for dimensioning buffers and fgets() cutoff limits.

About my comment on the RFC822 date format: yes, I fully appreciate that they didn't use "5:12 p.m. 7/8/96", as I have seen in another proprietary system recently. Things could always be worse ... ;-)

Markus

--
Markus G. Kuhn, Computer Science grad student, Purdue University,
Indiana, USA -- email: kuhn@cs.purdue.edu
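A generation routine for the examples above might look like the following sketch; format_stamp is a hypothetical name, and a real routine would also handle fractional seconds:

```c
#include <stdio.h>

/* Format a broken-down time in the style of the examples above:
   "YYYY-MM-DD hh:mm:ss" followed by "Z" (UTC) or a numeric offset,
   where the offset's minute part appears only when nonzero.
   off_minutes is the offset east of UTC; sketch for illustration. */
void format_stamp(char *buf, size_t n,
                  int y, int mo, int d, int h, int mi, int s,
                  int utc, int off_minutes)
{
    int len = snprintf(buf, n, "%04d-%02d-%02d %02d:%02d:%02d",
                       y, mo, d, h, mi, s);
    if (utc)
        snprintf(buf + len, n - len, "Z");
    else {
        char sign = off_minutes < 0 ? '-' : '+';
        int a = off_minutes < 0 ? -off_minutes : off_minutes;
        if (a % 60)
            snprintf(buf + len, n - len, "%c%02d:%02d", sign, a / 60, a % 60);
        else
            snprintf(buf + len, n - len, "%c%02d", sign, a / 60);
    }
}
```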
Date: Fri, 03 Jan 1997 15:34:20 -0500
From: kuhn@cs.purdue.edu ("Markus G. Kuhn")
Message-ID: <199701032034.PAA03444@ector.cs.purdue.edu>

   I am very well aware of 30 min offsets. You probably have not read my
   full proposal

No, it wasn't that; it was the wording of the phrase

   and remove the redundant minute offset digits

which I read as "since the minute offset digits are redundant, remove them", whereas you probably meant "and remove the minute offset digits when they are redundant".

Personally, I don't like even the latter -- I prefer fixed formats, without options, variations, etc.: one simple string to parse, where what comes next is always known and doesn't need tests -- or rather, where when a test fails it indicates a syntax error, rather than that someone just decided to omit that bit.

kre
On Fri, 3 Jan 1997, Robert Elz wrote:
   And while I'm here, it is all very nice to follow 8601 and all that,
   but if the aim of this draft is really to make a spec for reporting
   times that can be used on the internet, it is probably more important
   that the current internet time specs be examined, and needless
   differences be avoided. E.g.: the rfc822 (e-mail) way to report a
   numeric time zone is +nnnn (or -nnnn) -- no colons. There's about as
   much hope of that ever changing as there is of redefining time to use
   a much more rational 100 seconds in a minute, etc. Writing a spec
   that won't be used isn't very productive.
rfc 822 format, as amended by rfc 1123, is in wide use for email, and no one is suggesting changing that format (although DRUMS is working on making it more precise). My draft is more concerned with a format for new protocols, such as the vCard, vCalendar, WWW PICS and other proposals. Those are all blindly referencing 8601 and all its ambiguity and complexity.

822+1123 is an old format with a number of compatibility problems due to the historical use of 2-digit years and non-numeric timezones. It's also extremely English-centric. In addition, there's the issue of free insertion of comments (which is clearly allowed, but probably doesn't work in some cases).

There does seem to be a demand for a simpler, more international format for use in new protocols. My goal was to make a simple, precise and freely available Internet profile of 8601 and recommend its use for new protocols. I am very strongly opposed to creating yet-another-format, as there are already too many.
participants (5)

- Chris Newman
- Garrett Wollman
- kuhn@cs.purdue.edu
- Paul Eggert
- Robert Elz