In message <Pine.SOL.3.95.970115174526.15280R-100000@eleanor.innosoft.com>, Chris Newman wrote:
> On Sat, 28 Dec 1996, Markus G. Kuhn wrote:
> > 8) You could make the minute part of the time zone optional and require
> > that time zones with an integral number of hours offset to UTC (which
> > are almost all!) should only be represented by a 2-digit offset. That
> > is what GNU RCS does already and I think having only a single colon
> > increases human readability.
> I've got a better argument for why this is a bad idea. Many people
> implement by example and only look at the standard when things break.
You argue that it is good standards writing practice to let unprofessional implementors get away with not reading the spec? If someone does not look at the standard, then he or she deserves not to be called a good software engineer. You can't make any protocol standard fully guessable by example. In contrast to ISO documents, RFCs are VERY widely read and VERY easily available for anyone who is even just slightly interested. You should expect the users of your standard to read the standard.
> If we make the minutes optional for the offset, these people will
> probably never see minutes and their implementations will break in those
> countries which use minute offsets.
So they get punished with bug reports, and they don't deserve better! They will probably also not understand the -00:00 or Z, and will forget to accept the fractional seconds. That's no way to develop any quality software system.

I do not have very strong feelings about the optionality of the minute offset, but my arguments for it are:

 - Only *very* few countries use it today, and time zones are so often redefined that there is some good hope that the 30-min offset zones will disappear in 2020 or so. Then we would no longer carry around the obsolete minute fields in our headers.

 - I personally think that the hour-only offset is much more readable, and that the additional implementation effort (one single "if" in my sample code) is so trivial that it is really not worth any discussion.

Two points where I have much stronger feelings:

The "T":

 - It simply looks ugly.

 - ISO 8601 defines both a syntax for a date field and for a time field, and protocol designers do not violate the standard if they transmit a date and a time field separated by a space character, the normal field separator in many systems. The "T" was provided for cases where, for some reason (sortability, atomicity, etc.), you want to keep the date and time in one single field on database systems that separate fields with spaces. I hope that we can get an explanation for the "T" along these lines into the next revision of ISO 8601.

The decimal comma:

 - The decimal dot is clearly THE decimal separator dominating in computer text representations of numbers. All ISO programming languages use the decimal dot exclusively, and in ISO 6093:1985 "Information processing -- Representation of numerical values in character strings for information interchange", the decimal dot is the default decimal separator. I hope that we can convince the ISO 8601 authors to drop the preference for the decimal comma after reviewing other standards like ISO 6093 in the next revision.
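Kuhn's actual sample code is not reproduced in this message; the point about the "one single if" needed to accept an optional minute part can be sketched like this (an illustrative Python sketch, not his implementation):

```python
import re

def parse_utc_offset(s):
    """Parse a UTC offset of the form 'Z', '+HH', '-HH', '+HH:MM',
    or '-HH:MM'. Returns the offset in minutes east of UTC."""
    if s == "Z":
        return 0
    m = re.fullmatch(r"([+-])(\d{2})(?::(\d{2}))?", s)
    if m is None:
        raise ValueError("malformed UTC offset: %r" % s)
    sign = -1 if m.group(1) == "-" else 1
    minutes = int(m.group(2)) * 60
    # The single extra "if": accept an optional :MM part for the
    # few zones (such as +05:30) that are not a whole number of hours.
    if m.group(3) is not None:
        minutes += int(m.group(3))
    return sign * minutes
```

With this branch in place, both the common hour-only form and the rare hour:minute form parse correctly, e.g. parse_utc_offset("-08") and parse_utc_offset("+05:30").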
In addition, notations like 1970-01-01 00:00:00Z have become pretty common usage on the tz mailing list and in a number of USENET newsgroups that I read regularly. However, I have almost never seen the "T" used anywhere, and my experience is that people who see the ISO 8601 standard for the first time do not like it a lot.

Another suggestion: it would be great if we could get the format that Paul Eggert and I suggested mentioned as a named profile in the next revision of ISO 8601. Then the RFC would only have a tutorial function (repeat the definition of the profile and provide some examples, example code, and usage guidelines). I'd say it is time that we get involved in the work on the next revision of ISO 8601.

Markus

--
Markus G. Kuhn, Computer Science grad student, Purdue University, Indiana, USA -- email: kuhn@cs.purdue.edu