Re: Back-of-the-envelope cost of extra data :-)
Amending my previous email of 5/3/05 (where I said either method is OK): our preference is the algorithmic approach. Here are some comparisons (using the smaller 5-byte time storage proposed 5/9/05 by ado).

The 400-year method requires a larger timezone file (was 1K, now 6K). localtime() requires 3K/5K (32/64-bit) more memory per process for the state struct. There is more setup processing to initialize 400 years of the ats[] table, and about 0.8 MB more hard disk space for the total zoneinfo directory. All processes incur this overhead, but most would not make use of it. Performance for 64-bit transitions is probably better, as it is table-driven.

The algorithmic approach requires a smaller timezone file than the 400-year approach. Some extra bytes are needed to store the variables used by the algorithm (and more space for a small 64-bit transition table, if one is used). There is a little setup to initialize the algorithm's variables (and more setup for the small 64-bit transition table, if one is used). Additional hard disk space is minimal (or somewhat less than the 400-year file if a small 64-bit transition table is used). Performance for 64-bit transitions (past the 64-bit transition table, if one is used) would probably not be as good as with the 400-year cycle approach.

We prefer the algorithmic method because it requires less overhead in disk space, process memory, and setup processing time. We feel the possibly slower performance for 64-bit transitions is acceptable, since most applications won't use them.

Robbin