FW: New home for time zone stuff by 2012?

I'm forwarding this message from Mike Douglass, who is on the time zone mailing list now but was not when the message was sent.

--ado

-----Original Message-----
From: Mike Douglass [mailto:douglm@rpi.edu]
Sent: Thursday, August 27, 2009 9:56
To: tz@elsie.nci.nih.gov
Subject: RE: New home for time zone stuff by 2012?

The Calendaring and Scheduling Consortium (CalConnect) has, through a number of activities over time, demonstrated an active interest in addressing problems related to timezones for calendaring and scheduling systems. The standards in this space, namely iCalendar, were developed by the Internet Engineering Task Force (IETF). A number of issues have spurred this work within CalConnect, including (but not limited to) the US EDST changes of 2007.

As "consumers" of timezone data (rather than "producers" - the job done by the community represented by this mailing list, tz@elsie.nci.nih.gov) we are eager to see a reliable, timely and secure process for handling timezones.

In CalConnect's Timezone Technical Committee (TC), we are presently developing a timezone service protocol that will allow direct updates of client systems, rather than relying on the current process where systems typically get updated via OS upgrades, if at all. As part of this effort, we are also developing a generic timezone description format in XML so that interchange of timezone data can be done efficiently, and so that we can include structured meta-data such as KML for boundary information.

CalConnect would like to see a formal "standardization" of timezone names with a registry. This has been a problem in the iCalendar space, where it is presently difficult to rely on a timezone definition with a given name, often resulting in interoperability problems.
CalConnect would like to see timezone data passed "by reference" rather than "by value" for efficiency purposes (iCalendar requires that a VTIMEZONE component always be included in the iCalendar data stream when a timezone is referenced by an event).

Earlier this year, CalConnect hosted a timezone workshop at one of its face-to-face Roundtables. The primary focus of the workshop was to discuss the problem statement and the development of the protocol, data format and registry process. Since then we have also initiated discussions in the IETF on these topics.

As "consumers" of timezone data, CalConnect feels strongly about the need for these improvements. None of this necessarily impacts the process of "producing" timezone data as carried out by this list's community. Nonetheless, we care greatly about the "production" process because we have to rely on this data.

We have informally discussed within the CalConnect Timezone TC what we would like to see for the future of the timezone data. We have not come to any firm conclusions as to the best way forward. Mr. Olson's email, therefore, comes as a timely reminder that this needs to be addressed now.

Possible options (as already indicated by Mr. Olson) include:

- Moving it to an "open source" location (such as SourceForge, which has already been suggested)
- Setting up some kind of open consortium of interested parties to manage timezone data
- Moving responsibility to an existing standards body (e.g., the IETF or the Internet Society - ISOC)
- Moving responsibility to a government entity (e.g., the UN)

Unfortunately, this debate can easily get mired in "politics" rather than technical issues, e.g., who gets to control the data, how the service is paid for, and who gets to contribute.

At the end of the day, CalConnect favors an approach which results in the least amount of disruption.
The open community process developed by this list's community has clearly been a success, and should be considered as a potential model going forward.

CalConnect considers tightening up the security of the timezone data to be essential. Given that many systems rely on the data being produced, we collectively need secure distribution (i.e., a secure, reliable server, signed data, etc.). Whilst there have not been any obvious "attacks" against timezone data, one cannot assume there won't be any in the future. This is a propitious time to achieve consensus on the best way to secure the data. This may very well impose additional requirements on hosting the data in the future (e.g., the cost of maintaining the server, signing certificates, etc.).

CalConnect looks forward to the discussions on this issue, and would like to hear the thoughts of other members of this community. CalConnect is ready to host another face-to-face timezone workshop, open to all interested parties, at our member meeting in February 2010.

Mike Douglass <douglm@rpi.edu>
Chair, Timezone Technical Committee, on behalf of CalConnect

Some reactions in-line. paul
-----Original Message-----
From: Mike Douglass [mailto:douglm@rpi.edu]
Sent: Thursday, August 27, 2009 9:56
To: tz@elsie.nci.nih.gov
Subject: RE: New home for time zone stuff by 2012?
The Calendaring and Scheduling Consortium (CalConnect) has, through a number of activities over time, demonstrated an active interest in addressing problems related to timezones for calendaring and scheduling systems. The standards in this space, namely iCalendar, were developed by the Internet Engineering Task Force (IETF). A number of issues have spurred this work within CalConnect, including (but not limited to) the US EDST changes of 2007.
As "consumers" of timezone data (rather than "producers" - which relates to the job done by the community represented by this mailing list, tz@elsie.nci.nih.gov) we are eager to see a reliable, timely and secure process for handling timezones.
In CalConnect's Timezone Technical Committee (TC), we are presently developing a timezone service protocol that will allow for direct updates of client systems, rather than relying on the current process where systems typically get updated via OS upgrades, if at all. As part of this effort, we are also developing a generic timezone description format in XML so that interchange of timezone data can be done efficiently, and so that we can include structured meta-data like KML for boundary information.
A timezone service sounds like a good idea. It raises a bunch of questions, though.

Security would be far more of an issue with such an approach than with the current approach of relying on (and benefiting from) a secure OS distribution scheme.

Scalability could be a big problem. Consider every PC in the world polling such a server once a day for updates...

Embedded systems and systems in closed networks still need the existing scheme, because they either can't or don't want to connect to a timezone service.

XML is a nice and flexible interchange format. It usually isn't very efficient, but it is probably efficient enough. It is also very bulky compared to the current format. Again, consider embedded systems. In a system I work on, we can't possibly store the current tzdata format in full, let alone what it would look like if expressed in XML. We solved that by modifying the tz compiler to omit any historic data prior to V1.0, since the nature of this product is that it never has to show times predating its release. With that change, the data fits (it shrinks down to less than 200 kbytes, which is acceptable).
CalConnect would like to see a formal "standardization" of timezone names with a registry. This issue has been a problem in the iCalendar space where presently it is difficult to rely on a timezone definition with a given name, often resulting in interoperability problems. CalConnect would like to see timezone data passed "by reference" rather than "by value" for efficiency purposes (iCalendar requires that a VTIMEZONE component always be included in the iCalendar data stream when a timezone is referenced by an event).
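The VTIMEZONE-inline requirement quoted above looks like this in practice; an abbreviated, illustrative iCalendar stream (rules follow the post-2007 US practice; mandatory event properties elided):

```
BEGIN:VCALENDAR
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20070311T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=2SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20071104T020000
RRULE:FREQ=YEARLY;BYMONTH=11;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20091005T090000
END:VEVENT
END:VCALENDAR
```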
I believe it has been generally recognized that standardizing timezone names is a pipe dream. The abbreviated names are hopelessly ambiguous, and even setting that aside, there isn't an authority standardizing them. The long names (like "America/New_York") used in the tzdata are unambiguous. But they change occasionally as countries change city names. Then again, the tzdata contains links from the old to the new names, so the old names remain valid. That seems to be the best one could hope to achieve, and it's here now.

What do "by reference" and "by value" mean in the context of timezone data? I can't figure out the meaning here.
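The old-to-new name links mentioned above are plain Link lines in the tzdata source; for example, real entries from the 2008-era data:

```
# Link  TARGET            LINK-NAME (old name, kept valid)
Link    Asia/Kolkata      Asia/Calcutta
Link    Asia/Ho_Chi_Minh  Asia/Saigon
```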
Earlier this year, CalConnect hosted a timezone workshop at one of its face-to-face Roundtables. The primary focus of the workshop was to discuss the problem statement and development of the protocol, data format and registry process. Since then we have also initiated discussions in the IETF on these topics.
As "consumers" of timezone data CalConnect feels strongly about the need for these improvements. None of this necessarily impacts the process of "producing" timezone data as carried out by this list's community. Nonetheless, we care greatly about the "production" process because we have to rely on this data.
We have informally discussed within the CalConnect Timezone TC what we would like to see for the future of the timezone data. We have not come to any firm conclusions as to the best way forward. Mr. Olson's email, therefore, comes as a timely reminder that this needs to be addressed now.
Possible options (as already indicated by Mr. Olson) include:
- Moving it to an "open source" location (such as SourceForge, which has already been suggested)
- Setting up some kind of open consortium of interested parties to manage timezone data
- Moving responsibility to an existing standards body (e.g., the IETF or the Internet Society - ISOC)
- Moving responsibility to a government entity (e.g., the UN)
Unfortunately, this debate can easily get mired in "politics" rather than technical issues, e.g., who gets to control the data, how the service is paid for, and who gets to contribute.
Indeed. I believe an open process is critical. The current process is an example, but it depends in large part on the efforts of one person. A consortium formed for the purpose seems like it would work. The IETF is clearly also qualified to do it, and has a properly open process. The question is whether it would want to do it, or whether it would consider it out of charter. A government entity seems like a recipe for disaster.
At the end of the day, CalConnect favors an approach which results in the least amount of disruption. The open community process developed via this list's community has clearly been a success, and should be considered as a potential model going forward.
CalConnect considers tightening up the security of the timezone data to be essential. Given that many systems rely on the data being produced, we collectively need secure distribution (i.e., a secure, reliable server, signed data, etc.). Whilst there have not been any obvious "attacks" against timezone data, one cannot assume there won't be any in the future. This is a propitious time to achieve consensus on the best way to secure the data. This may very well impose additional requirements on hosting the data in the future (e.g., the cost of maintaining the server, signing certificates, etc.).
I agree that signed data would be worth having. This need not add any cost; while SSL certificates may be expensive, PGP ones are free and widely used in the open source community. paul
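As an illustration of the PGP approach Paul mentions, here is a sketch of a release-signing workflow. The filenames and key identity are hypothetical, and the commands assume GnuPG 2.1 or later:

```shell
# Throwaway keyring so the demo does not touch a real one
export GNUPGHOME="$(mktemp -d)"
printf 'demo tz release\n' > tzdata-example.tar.gz   # stand-in for a release tarball

# One-time: generate a signing key pair (passphrase-less, for the demo only)
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key 'tz-release (demo) <tz@example.org>' ed25519 sign never

# Per release: produce a detached, ASCII-armored signature alongside the tarball
gpg --batch --yes --armor --local-user 'tz@example.org' \
    --detach-sign tzdata-example.tar.gz

# Consumers verify against the published public key
gpg --verify tzdata-example.tar.gz.asc tzdata-example.tar.gz && echo VERIFIED
```

The detached `.asc` file travels next to the tarball, so anyone mirroring the data can carry the signature along unchanged.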

<<On Thu, 27 Aug 2009 10:49:28 -0400, Paul Koning <Paul_Koning@Dell.com> said:
XML is a nice and flexible interchange format. It usually isn't very efficient but probably efficient enough. It also is very bulky compared to the current format.
It's also horribly unreadable. Although there is sometimes confusion about what the current tzdata format means, it's very easy to read, and the most common types of entries require little explanation to understand.
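As an illustration of that readability, here are the current US rules and the final offset line of the America/New_York entry, as found in the northamerica source file (history lines omitted; the last line is really a continuation of a longer Zone block):

```
# Rule  NAME  FROM  TO   TYPE  IN   ON      AT    SAVE  LETTER/S
Rule    US    2007  max  -     Mar  Sun>=8  2:00  1:00  D
Rule    US    2007  max  -     Nov  Sun>=1  2:00  0     S
# Zone  NAME              GMTOFF  RULES  FORMAT
Zone    America/New_York  -5:00   US     E%sT
```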
A consortium formed for the purpose seems like it would work. The IETF is clearly also qualified to do it, and has a properly open process. The question is whether it would want to do it, or whether it would consider it out of charter.
Likewise W3C. I'd actually be rather concerned about a formal consortium. I believe it is very important that any new organization going forward commits to keeping the fruits of the project in the public domain, and nearly any consortium is going to have a great deal of trouble getting its owners/members/lawyers to go along with that. Formal organizations also tend to develop very formalized processes, and (even worse) eventually become dominated by professional minders (representing their employers' interests) rather than by the sort of people you want maintaining a fundamental bit of infrastructure like this. (Viz., ICANN.) That would be the worst possible outcome.
A government entity seems like a recipe for disaster.
OK, second-worst. Having the UN, ISO/IEC, or a national government try to do this would be even worse. (And the ISO process, at least, has bad IPR issues and generally delegates the maintenance to a member body, which reduces to the previous case.) -GAWollman

Paul Koning writes
XML is a nice and flexible interchange format. It usually isn't very efficient, but it is probably efficient enough. It is also very bulky compared to the current format. Again, consider embedded systems. In a system I work on, we can't possibly store the current tzdata format in full, let alone what it would look like if expressed in XML. We solved that by modifying the tz compiler to omit any historic data prior to V1.0, since the nature of this product is that it never has to show times predating its release. With that change, the data fits (it shrinks down to less than 200 kbytes, which is acceptable).
Perhaps you missed the point. Your embedded system would not store the XML any more than it now stores the full text distribution with all comments. You now store a compiled form of the distribution text; in the future you would store a compiled form of the XML distribution.

I think there are more pluses than minuses in going to XML, particularly if an XML Schema or RELAX NG schema is also distributed. It would probably be necessary to have a converter from XML back to the current format, but that should not be a problem.

++PLS
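The "compiled form" both posts refer to is the binary TZif format that zic produces. A small sketch that inspects one such file's fixed-size header; the path is an assumption (most Unix systems install the compiled data under /usr/share/zoneinfo):

```python
import struct

# Assumed location of a compiled zone file on a typical Unix system
path = "/usr/share/zoneinfo/UTC"

with open(path, "rb") as f:
    header = f.read(44)  # the TZif header is a fixed 44 bytes

magic = header[:4]       # always b"TZif" for a valid file
version = header[4:5]    # b"\x00", b"2", or b"3"
# Six big-endian 32-bit counts: isutcnt, isstdcnt, leapcnt,
# timecnt, typecnt, charcnt
counts = struct.unpack(">6l", header[20:44])

print(magic, version, counts)
```

The counts describe how many transition times and local-time types follow the header, which is why the compiled form stays compact even for zones with long histories.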

| From: Mike Douglass [mailto:douglm@rpi.edu]
| Sent: Thursday, August 27, 2009 9:56
| To: tz@elsie.nci.nih.gov
| Subject: RE: New home for time zone stuff by 2012?

| In CalConnect's Timezone Technical Committee (TC), we are presently
| developing a timezone service protocol that will allow for direct updates
| of client systems, rather than relying on the current process where
| systems typically get updated via OS upgrades, if at all.

That sounds fine, though (other than to the group of people with a direct reason to really want to stay up to date) I'm not sure it will have much impact upon the vast majority of nodes.

| As part of this
| effort, we are also developing a generic timezone description format in
| XML so that interchange of timezone data can be done efficiently, and so
| that we can include structured meta-data like KML for boundary information.

That's also fine - just try to concentrate on your own needs, and define just what you need for your purposes - don't try to solve everyone else's problems at the same time. Allow others to develop their own mechanisms, and if needed, formats. They can start with what we have, what you produce, or some other available format - whatever best suits their needs.

| CalConnect would like to see a formal "standardization" of timezone names
| with a registry.

We have standard names already, and a registry - the only real problem we have (aside from governments who won't allow almost anything to remain stable in this area) is that people keep wanting to disagree about the actual names. That's the least productive of everything that happens here; name choice is a hideously non-technical area, where most of the arguments derive from politics, religion, and vanity [*], rather than anything with any substantive rationale. If you want to get into that, please do - just don't do it here, or (and I perhaps shouldn't speak for everyone else here, but I will) anywhere near the rest of us.
| This issue has been a problem in the iCalendar space
| where presently it is difficult to rely on a timezone definition with a
| given name, often resulting in interoperability problems.

I have no idea what names you're using (or if perhaps the problems are more related to versioning than naming), but if you just used the tzdata names then (versioning issues aside) you would have no more technical problems with the names (though you're going to have plenty of social problems).

| Possible options (as already indicated by Mr. Olson) include:
|
| - Moving it to an "open source" location (such as SourceForge, which has
| been already suggested)

Personally, I'd hate to see this model. I suspect that one of the real reasons for the success and quality of the timezone code and data is precisely because this model has not been followed. The code and data is open source in the sense that anyone can grab it, and do whatever they like with it, but it is 100% closed in the sense that there's exactly one person who gets to actually make the changes. With the right person (which we've been lucky enough to have until now, or rather, probably until now plus the next couple of years or so) this works far better, faster, and more reliably, than any SourceForge-type solution, for this kind of (relatively small) project.

| - Setting up some kind of open consortium of interested parties to manage
| timezone data

The data isn't open to debate (or shouldn't be); we don't need a consortium to manage it - what we could do with is better communication channels to and from the people who actually make the changes, so that we can record them correctly, and in a more timely fashion than we currently sometimes manage.

| - Moving responsibility to an existing standards body (e.g., the IETF or
| the Internet Society - ISOC)
| - Moving responsibility to a government entity (e.g., the UN)

I doubt either of those is needed.
Remember we are not creating anything new; we're just attempting to document what others have done, or what (in some cases) it seems they are about to do.

| Unfortunately, this debate can easily get mired in "politics" rather than
| technical issues. e.g., who gets to control the data, how is the service
| paid for, who gets to contribute.

Yes ... but as long as we insist on the data remaining available to all (which is not hard, as anyone who would attempt to restrict it can simply be ignored, and someone else just continues from the last public version, plus updates), the control issue need not be much of a problem.

If we ever start worrying about money for this, then we have really failed. I can't speak for ado, obviously, but it is hard to imagine the process of looking after this stuff (perhaps the mailing list aside) consuming more than 15 or 20 minutes a week (on average; there are busy and quiet periods). Since our "customers" are fairly limited, the traffic involved in distributing the data to those who need to get it directly from the source (rather than via an OS update, or similar, or whatever you are planning) should be, and remain, basically negligible.

And anyone can contribute, of course - but all we want is authenticated facts. (This is for the data, which is what matters most; we're perhaps already just about reaching the point where maintaining the code is not so necessary any more - others can do that, in their formats for their own systems; we no longer really need to provide free code just to convince people that it is cost effective to use this data.)

| At the end of the day, CalConnect favors an approach which results in the
| least amount of disruption. The open community process developed via this
| list's community has clearly been a success, and should be considered as
| a potential model going forward.

Yes, we (it seems) might need a change of czar, but ideally that's about all we need to be changing.
| CalConnect considers tightening up of the security of the timezone data
| to be essential. Given that many systems rely on the data being produced,
| we collectively need a secure distribution (i.e. a secure, reliable
| server, signed data etc).

That seems reasonable, and fairly easy.

| This may very well impose additional
| requirements on hosting the data in the future, e.g., cost of
| maintaining the server, signing certificates etc).

I sincerely hope the server cost never becomes an issue. We don't need costly certificates for this - or rather, we don't need one of the ones that you would buy from a public certificate authority - those have their uses, but this is not one of them.

After all, the only things needed to make a CA are trust and stability - it needs to be trusted by all of its users, and stable enough not to require changes to its keys. For our purposes, we have (and should keep on having) relatively few consumers - all we need is for all of them to trust us (whoever is the czar) and the published public key. As long as everyone "knows" what that key is, it is essentially impossible for anyone to subvert (ignoring attacks that attempt to discover the private key). That's the same principle that the "big" CAs operate on, except that they need to deal with very large numbers of unknown consumers, so they have more work to do; ours is a simpler task.

So, we just need a key pair: keep the private key secure, and widely publish (amongst the community that cares) the public key; use that key to sign a certificate, and use the key in that certificate to sign the data distributions (two levels, as the actual key that's in use cannot be kept as secure - it needs to be available to sign updates, so it is more vulnerable). Cost: about half an hour of someone's time to set up, and essentially nothing thereafter.
kre

ps: an off-list message suggested a couple of possible people, one of whom was me, as possible successors to ado - while I'm flattered by the suggestion, and would be willing, I would not make a good choice, as we (you) would probably just have to repeat the process again far too soon; I am also not that far away from retirement age (sometime next decade at least, I would expect).

[*]:
politics - making decisions based upon what will most cause other people to prefer you, rather than some other guy
religion - making decisions based upon no better justification than that someone said it was so, therefore it must be so, and it is not debatable
vanity - making decisions in order that you see yourself as having been more important / better / ... than everyone else

Quoting Robert Elz <kre@munnari.OZ.AU>: ..
| Possible options (as already indicated by Mr. Olson) include: | | - Moving it to an "open source" location (such as SourceForge, which has | been already suggested)
Personally, I'd hate to see this model. I suspect that one of the real reasons for the success and quality of the timezone code and data is precisely because this model has not been followed. The code and data is open source in the sense that anyone can grab it, and do whatever they like with it, but it is 100% closed in the sense that there's exactly one person who gets to actually make the changes.
I will speak for what I know and use. The number of people authorized to change the code has no connection with the system used to distribute the code. You may have only one person with write access to the source and data trees, and the same or a different person allowed to upload a package (though it would be wise to have more than one admin over the life of the project). The current delivery system breaks any script used to build from source each time a new tzdata package is delivered (and the previous one is removed). This could sometimes be considered a feature, but it is painful for others. One quality of SourceForge is that a package that has been made available is never removed. SF is low (time) cost and very low maintenance for the project admin. Gilles

At 07:16 27-08-2009, Olson, Arthur David (NIH/NCI) [E] wrote:
I'm forwarding this message from Mike Douglass, who is on the time zone mailing list now but was not when the message was sent.
--ado
-----Original Message-----
From: Mike Douglass [mailto:douglm@rpi.edu]
Sent: Thursday, August 27, 2009 9:56
[snip]
Possible options (as already indicated by Mr. Olson) include:
- Moving it to an "open source" location (such as SourceForge, which has been already suggested)
In terms of resources, the time zone stuff needs a mailing list and a way to distribute the database. It is better to have a party who can ensure the long term stability of the resources.
- Setting up some kind of open consortium of interested parties to manage timezone data
That brings some formality to the time zone stuff together with a set of new problems depending on how the open consortium works.
- Moving responsibility to an existing standards body (e.g., the IETF or the Internet Society - ISOC) - Moving responsibility to a government entity (e.g., the UN)
Unfortunately, this debate can easily get mired in "politics" rather than technical issues. e.g., who gets to control the data, how is the service paid for, who gets to contribute.
Yes.
At the end of the day, CalConnect favors an approach which results in the least amount of disruption. The open community process developed via this list's community has clearly been a success, and should be considered as a potential model going forward.
Absolutely.
CalConnect considers tightening up the security of the timezone data to be essential. Given that many systems rely on the data being produced, we collectively need secure distribution (i.e., a secure, reliable server, signed data, etc.). Whilst there have not been any obvious "attacks" against timezone data, one cannot assume there won't be any in the future. This is a propitious time to achieve consensus on the best way to secure the data. This may very well impose additional requirements on hosting the data in the future (e.g., the cost of maintaining the server, signing certificates, etc.).
The problem with security is that it is at odds with the "open model". If you get into signing certificates, you have to determine who signs the data. You invite "attacks" with a "central" model. It's better to leave it to the parties which interact with the "consumers" to determine how they want to secure the time zone data they provide.

At 08:43 27-08-2009, Robert Elz wrote:
That's also fine - just try to concentrate on your own needs, and define just what you need for your purposes - don't try and solve everyone else's problems at the same time. Allow others to develop their own mechanisms, and if needed, formats. They can start with what we have, what you produce, or some other available format - whatever best suits their needs.
Yes.
Personally, I'd hate to see this model. I suspect that one of the real reasons for the success and quality of the timezone code and data is precisely because this model has not been followed. The code and data is open source in the sense that anyone can grab it, and do whatever they like with it, but it is 100% closed in the sense that there's exactly one person who gets to actually make the changes.
The code and data goes beyond open source. Anyone can grab it and do whatever they like with it; they can even change the names.
With the right person (which we've been lucky enough to have until now, or rather, probably until now plus the next couple of years or so) this works far better, faster, and more reliably, than any SourceForge-type solution, for this kind of (relatively small) project.
Agreed. Regards, -sm

Excerpt of message (sent 27 August 2009) by SM:
...
CalConnect considers tightening up the security of the timezone data to be essential. Given that many systems rely on the data being produced, we collectively need secure distribution (i.e., a secure, reliable server, signed data, etc.). Whilst there have not been any obvious "attacks" against timezone data, one cannot assume there won't be any in the future. This is a propitious time to achieve consensus on the best way to secure the data. This may very well impose additional requirements on hosting the data in the future (e.g., the cost of maintaining the server, signing certificates, etc.).
The problem with security is that it is at odds with the "open model". If you get into signing certificates, you have to determine who signs the data. You invite "attacks" with a "central" model.
I'm not sure that's a real problem. Many open source projects have signed releases. All it means is that whoever volunteers to do the actual distribution (packaging up of the tarball, putting it on the distribution sites) has a signing key and uses it to sign the tarball. It doesn't prevent others from distributing their own, signed or not. It merely means that there exists at least one distribution that has a signature on it.
At 08:43 27-08-2009, Robert Elz wrote:
... The code and data is open source in the sense that anyone can grab it, and do whatever they like with it, but it is 100% closed in the sense that there's exactly one person who gets to actually make the changes.
The code and data goes beyond open source. Anyone can grab it and do whatever they like with it; they can even change the names.
That's true for a lot of open source, too. Don't confuse open source with GPL. GPL is one specific example, and more restrictive than most of the other flavors.
With the right person (which we've been lucky enough to have until now, or rather, probably until now plus the next couple of years or so) this works far better, faster, and more reliably, than any SourceForge-type solution, for this kind of (relatively small) project.
Agreed.
True, it's a pretty small project. Then again, a lot of SourceForge-based projects only have one or two developers, too. I think in the final analysis SourceForge is nothing more than a well-known supplier of mailing list and file server services. If obtaining those services is an issue, they are one possible solution. If whoever ends up volunteering to be the new lead is in a position to provide space and list services directly -- as ADO has done -- then that works fine too; everything is self-contained. The efficiency and speed you mentioned come from the size of the team and the specific personalities in it. The provider of the infrastructure doesn't seem to enter into it. paul

Dear all,

The TZ database and associated code represent a remarkable and yet often unsung success that the people who have contributed to it should be very proud of. Apart from the work done to release databases and code, the work done gathering the information is critical to the success of any effort. And what's *really* remarkable is just how *little* fragmentation has occurred when you consider that all of this stuff is public domain. We ought to do our best to perpetuate the existing model as best we can.

My experience with standards organizations is that they come with a lot of process baggage. Before we engage them, I would first want to know if this group could easily identify a senior contributor who would be willing to step up to the task (I'll just note that I am not senior, nor am I a contributor, really). Disk space and hosting are cheap. This group having a plan BEFORE engaging the standards bodies will ensure that we get what we want.

Eliot

On 8/27/09 4:16 PM, Olson, Arthur David (NIH/NCI) [E] wrote:
I'm forwarding this message from Mike Douglass, who is on the time zone mailing list now but was not when the message was sent.
--ado
-----Original Message----- From: Mike Douglass [mailto:douglm@rpi.edu] Sent: Thursday, August 27, 2009 9:56 To: tz@lecserver.nci.nih.gov Subject: RE: New home for time zone stuff by 2012?
The Calendaring and Scheduling Consortium (CalConnect) has, through a number of activities over time, demonstrated a long-standing, active interest in addressing problems related to timezones for calendaring and scheduling systems. The standards in this space, namely iCalendar, were developed by the Internet Engineering Task Force (IETF). A number of issues have spurred this work within CalConnect, including (but not limited to) the US daylight saving time changes of 2007.
As "consumers" of timezone data (rather than "producers" - which relates to the job done by the community represented by this mailing list, tz@elsie.nci.nih.gov) we are eager to see a reliable, timely and secure process for handling timezones.
In CalConnect's Timezone Technical Committee (TC), we are presently developing a timezone service protocol that will allow for direct updates of client systems, rather than relying on the current process where systems typically get updated via OS upgrades, if at all. As part of this effort, we are also developing a generic timezone description format in XML so that interchange of timezone data can be done efficiently, and so that we can include structured meta-data like KML for boundary information.
CalConnect would like to see a formal "standardization" of timezone names with a registry. This issue has been a problem in the iCalendar space where presently it is difficult to rely on a timezone definition with a given name, often resulting in interoperability problems. CalConnect would like to see timezone data passed "by reference" rather than "by value" for efficiency purposes (iCalendar requires that a VTIMEZONE component always be included in the iCalendar data stream when a timezone is referenced by an event).
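To make the cost of passing timezones "by value" concrete, here is a small illustrative sketch. The VTIMEZONE text below is a simplified, hypothetical definition (not authoritative tz data): under iCalendar's current rule, every data stream that references a timezone must embed the full VTIMEZONE component, whereas a by-reference scheme would transmit only the TZID.

```python
# Illustrative only: a simplified VTIMEZONE for America/New_York,
# not a complete or authoritative definition.
VTIMEZONE = "\r\n".join([
    "BEGIN:VTIMEZONE",
    "TZID:America/New_York",
    "BEGIN:STANDARD",
    "DTSTART:20071104T020000",
    "TZOFFSETFROM:-0400",
    "TZOFFSETTO:-0500",
    "TZNAME:EST",
    "END:STANDARD",
    "BEGIN:DAYLIGHT",
    "DTSTART:20070311T020000",
    "TZOFFSETFROM:-0500",
    "TZOFFSETTO:-0400",
    "TZNAME:EDT",
    "END:DAYLIGHT",
    "END:VTIMEZONE",
])

# A minimal event that references the timezone by its TZID.
EVENT = "\r\n".join([
    "BEGIN:VEVENT",
    "DTSTART;TZID=America/New_York:20100201T090000",
    "SUMMARY:Timezone workshop",
    "END:VEVENT",
])

by_value = len(VTIMEZONE) + len(EVENT)  # VTIMEZONE embedded in the stream
by_reference = len(EVENT)               # only the TZID reference travels
print(by_value, by_reference)
```

Even in this toy case the embedded definition dwarfs the event itself, and the overhead repeats in every stream that mentions the timezone.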
Earlier this year, CalConnect hosted a timezone workshop at one of its face-to-face Roundtables. The primary focus of the workshop was to discuss the problem statement and development of the protocol, data format and registry process. Since then we have also initiated discussions in the IETF on these topics.
As "consumers" of timezone data CalConnect feels strongly about the need for these improvements. None of this necessarily impacts the process of "producing" timezone data as carried out by this list's community. Nonetheless, we care greatly about the "production" process because we have to rely on this data.
We have informally discussed within the CalConnect Timezone TC what we would like to see for the future of the timezone data. We have not come to any firm conclusions as to the best way forward. Mr. Olson's email, therefore, comes as a timely reminder that this needs to be addressed now.
Possible options (as already indicated by Mr. Olson) include:
- Moving it to an "open source" location (such as SourceForge, which has already been suggested)
- Setting up some kind of open consortium of interested parties to manage timezone data
- Moving responsibility to an existing standards body (e.g., the IETF or the Internet Society, ISOC)
- Moving responsibility to a government entity (e.g., the UN)
Unfortunately, this debate can easily get mired in "politics" rather than technical issues: e.g., who gets to control the data, how the service is paid for, and who gets to contribute.
At the end of the day, CalConnect favors an approach which results in the least amount of disruption. The open community process developed via this list's community has clearly been a success, and should be considered as a potential model going forward.
CalConnect considers tightening up the security of the timezone data to be essential. Given that many systems rely on the data being produced, we collectively need secure distribution (i.e., a secure, reliable server, signed data, etc.). Whilst there have not been any obvious "attacks" against timezone data, one cannot assume there won't be any in the future. This is a propitious time to achieve consensus on the best way to secure the data. This may very well impose additional requirements on hosting the data in the future (e.g., the cost of maintaining the server, signing certificates, etc.).
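One possible building block for such a secure distribution is letting consumers verify a downloaded archive against a digest published over a trusted channel. A minimal sketch in Python follows; the archive contents here are hypothetical stand-ins, and a real deployment would more likely use detached cryptographic signatures (e.g., PGP) rather than a bare checksum:

```python
import hashlib

def verify_sha256(data: bytes, expected_hex: str) -> bool:
    """Return True if the SHA-256 digest of `data` matches the published hex digest."""
    return hashlib.sha256(data).hexdigest() == expected_hex

# Hypothetical example: in practice the digest would be published
# alongside the release by whoever hosts the timezone data.
archive = b"pretend this is a tzdata release archive"
published_digest = hashlib.sha256(archive).hexdigest()

assert verify_sha256(archive, published_digest)          # untampered data passes
assert not verify_sha256(archive + b"x", published_digest)  # any change is detected
```

A checksum alone only detects corruption; authenticating *who* published the data is the harder part, and is exactly where signing certificates and their costs come in.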
CalConnect looks forward to the discussions on this issue, and would like to hear the thoughts of other members of this community. CalConnect is ready to host another face-to-face timezone workshop, open to all interested parties, at our member meeting in February 2010.
Mike Douglass <douglm@rpi.edu>
Chair, Timezone Technical Committee, on behalf of CalConnect
participants (8):
- Eliot Lear
- Garrett Wollman
- Gilles Espinasse
- Olson, Arthur David (NIH/NCI) [E]
- Paul Koning
- Paul Schauble
- Robert Elz
- SM