Re: [tz] Using MediaWiki / Wikidata
Tobias Conradi wrote:
was: Re: [tz] [PATCH 3/3] * europe (Europe/Vaduz): Now a link to Europe/Zurich.
On Tue, Sep 10, 2013 at 7:29 PM, Paul Eggert <eggert@cs.ucla.edu> wrote:
It may well make sense to organize what is currently tz commentary into something that is, well, a bit more organized. The format I'm most familiar with is MediaWiki, for what it's worth. And here is some infrastructure using MediaWiki: https://en.wikipedia.org/wiki/Category:Tz_database
E.g.:
https://en.wikipedia.org/wiki/Africa/Luanda
https://en.wikipedia.org/wiki/America/Indiana/Indianapolis
https://en.wikipedia.org/wiki/Asia/Novosibirsk
But there are vandalizing admins around, e.g. Anthony Bradbury: https://en.wikipedia.org/w/index.php?title=Special:Log&page=Time+in+Africa
Another option could be to use Wikidata, e.g. Asia/Novosibirsk https://www.wikidata.org/wiki/Q4806295, and put evidence for the time offsets there.
Personally I have no time for Wikipedia, since we had a lot of well-written material removed in the past because other editors decided they did not like it. So I have no confidence that any material posted there would remain clean or even available. I think a 'private' wiki is much more appropriate, given that what we are going to archive is evidential material — with publicly accessible sandboxes for providing new material, which is then managed based on its suitability for inclusion.

I could live with MediaWiki format, but I much prefer clean HTML documents using CKEditor. And adding interactive packages is not something I find particularly easy with those types of systems anyway.

Is there somewhere appropriate to host a service, Paul? I'm happy to run something on my servers, but I suspect the load may be quite high ;)

--
Lester Caine - G8HFL
-----------------------------
Contact - http://lsces.co.uk/wiki/?page=contact
L.S.Caine Electronic Services - http://lsces.co.uk
EnquirySolve - http://enquirysolve.com/
Model Engineers Digital Workshop - http://medw.co.uk
Rainbow Digital Media - http://rainbowdigitalmedia.co.uk
Your experimental github repo comes with a wiki. You can make changes in the same way you do code changes. It wouldn't be hugely difficult to render that into HTML to also be hosted at IANA.

On 10 Sep 2013 22:55, "Paul Eggert" <eggert@cs.ucla.edu> wrote:
Lester Caine wrote:
Is there somewhere appropriate to host a service Paul?
If we're talking HTML, IANA can host it; that's what they're doing already, for tz-link.htm.
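The markdown-to-HTML step Kevin suggests could look something like the sketch below — assuming a converter such as pandoc is available on the host. The function name and directory layout are illustrative, not part of any existing tz tooling:

```shell
# Convert every Markdown page in a checked-out wiki directory to
# standalone HTML, ready to be uploaded to a plain web host such as
# IANA's. Assumes pandoc (or any converter with the same calling
# convention) is on the PATH.
render_wiki() {
  src=$1 dst=$2
  mkdir -p "$dst"
  for page in "$src"/*.md; do
    pandoc -s "$page" -o "$dst/$(basename "${page%.md}").html"
  done
}
```

After cloning the wiki repository, something like `render_wiki tz.wiki html` would leave a static `html/` tree that could be served as-is.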
Kevin Lyda wrote:
Your experimental github repo comes with a wiki. You can make changes in the same way you do code changes.
It wouldn't be hugely difficult to render that into HTML to also be hosted at IANA.
I was waiting on someone saying that ;)

The problem with the github wiki, and with the wikis on sourceforge and other code-management sites, is that they don't allow the sort of control that is needed to maintain what will essentially be a reference document for the decisions on data in the tz database itself. A DVCS is good at managing the packages of material making up a software project, but documents need fine-grained control; the history mechanism partially addresses that, but update access needs a different access model.

Personally I would still like to see a PAIR of DVCS repos: one with the tz data and a separate repo with the software. This would allow management of releases of the data along with a complete historic record, which could then be used by alternate packages of software. In github terms, the data repo could have an attached wiki used to store submissions of evidence for the currently well-known holes in the data. But the 'published' documents need to be a little more locked down.

At this stage I'm just saying that using Paul's github account is not the right venue for a production repository — and moving forward, is reliance on a third-party system a good idea? Remember sourceforge? Why are we not using that nowadays?
Sourceforge is still there. I clone my projects to it (and to github, bitbucket, google and gitorious). The joys of a DVCS.

It's very simple to turn markdown into html and host that on IANA's page. Access control really isn't that hard when you have pull requests and the like. I do think Paul would want the ability to review all changes, but that doesn't preclude a tiered model where a few others might be trusted lieutenants whom n00bs like me might send changes to; those changes, once cleaned up, might then get sent to Paul.

And doing it as a separate repo is cool too, though do note that github exports its wiki as a repo as well. What might be interesting would be if IANA started using something like gitlab to host projects, as its wiki system is similar to github's.

As an aside, I have the following defined under [alias] in my ~/.gitconfig:

dist = "!bash -c 'for r in $(git remote); do echo \"Pushing to $r\"; git push --all \"$r\"; git push --tags \"$r\"; done'"

That allows me to push everything to every remote site I have defined with 'git dist'. So in Paul's case, he could push out his changes to a collection of code repos so end users could get tzinfo even if one or more are down (for those obsessed with availability). The wiki side is trickier, as only some code-hosting sites do wikis as repos, and even then some have tighter restrictions than github on format (google uses its own, and I think bitbucket does as well).

Kevin

On Wed, Sep 11, 2013 at 8:19 AM, Lester Caine <lester@lsces.co.uk> wrote:
Kevin Lyda wrote:
Your experimental github repo comes with a wiki. You can make changes in the same way you do code changes.
It wouldn't be hugely difficult to render that into HTML to also be hosted at IANA.
I was waiting on someone saying that ;)
The problem with the github wiki, and with the wikis on sourceforge and other code-management sites, is that they don't allow the sort of control that is needed to maintain what will essentially be a reference document for the decisions on data in the tz database itself. A DVCS is good at managing the packages of material making up a software project, but documents need fine-grained control; the history mechanism partially addresses that, but update access needs a different access model.
Personally I would still like to see a PAIR of DVCS repos: one with the tz data and a separate repo with the software. This would allow management of releases of the data along with a complete historic record, which could then be used by alternate packages of software. In github terms, the data repo could have an attached wiki used to store submissions of evidence for the currently well-known holes in the data. But the 'published' documents need to be a little more locked down.
At this stage I'm just saying that using Paul's github account is not the right venue for a production repository — and moving forward, is reliance on a third-party system a good idea? Remember sourceforge? Why are we not using that nowadays?
--
Kevin Lyda
Galway, Ireland
US Citizen overseas? We can vote. Register now: http://www.votefromabroad.org/
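For readability, here is Kevin's one-line `dist` alias above unpacked into an equivalent shell function (the `git_dist` name is mine; the behaviour is the same loop over `git remote`):

```shell
# Push all branches and all tags to every configured remote — the same
# loop Kevin keeps in his ~/.gitconfig as the "dist" alias.
git_dist() {
  for r in $(git remote); do
    echo "Pushing to $r"
    git push --all "$r"
    git push --tags "$r"
  done
}
```

With several remotes configured (github, bitbucket, sourceforge, …), one invocation mirrors the repository everywhere, which is the availability argument Kevin makes for the tz data.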
On 11 September 2013 08:19, Lester Caine <lester@lsces.co.uk> wrote:
The problem with the github wiki and the sourceforge and other code management site wiki's is that they don't allow the sort of control that is needed to maintain what will be essentially a reference document to the decisions on data in the tz database itself. DVCS is good at managing packages of material making up a software project, but documents need a fine grain control, which the history mechanism does partially address, but access to update needs a different access model.
The wiki may not have easily accessible history, but GitHub Pages do: http://pages.github.com/ — e.g. https://github.com/JodaOrg/jodaorg.github.io and http://www.joda.org

Stephen
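For reference, the GitHub Pages mechanism Stephen links to amounts to a branch containing only static files. A minimal local sketch — the repository name and page content are illustrative, and the actual push to GitHub is omitted:

```shell
# Build a gh-pages branch from scratch: an orphan branch (no shared
# history with the code) holding nothing but static HTML. Pushing such
# a branch to GitHub — or using a <user>.github.io repository, as with
# jodaorg.github.io — is what makes it a website.
cd "$(mktemp -d)"                  # throwaway working directory
git init -q pages && cd pages
git checkout -q --orphan gh-pages  # branch with no parent commits
echo '<h1>tz database commentary</h1>' > index.html
git add index.html
git -c user.name=example -c user.email=example@example.org \
    commit -q -m "first page"
# A real setup would now run: git push origin gh-pages
```

This is also why Kevin's "locked into git" objection applies: the published site lives as a branch inside the code repository rather than as a separate repo.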
On Thu, Sep 12, 2013 at 11:33 AM, Stephen Colebourne <scolebourne@joda.org> wrote:
The wiki may not have easily accessible history, but GitHub Pages do http://pages.github.com/
The problem with that is that it is hugely github-specific. Yes, you can check that branch out elsewhere and display it, but you're generally locked into git (since it's a git branch, not a separate repo), and it's a very github-centric way of doing web pages. They're cute, but they're kind of too cute.

Using the separate wiki repo instead works with other code hosters, and it's quite easy to use a static page generator to host it elsewhere. It's also a little easier to understand: code/data in one repo; wiki in a second repo. People new to git are overwhelmed by branches in a single source tree — start telling them that you can actually have multiple trees in a repo and their eyes glaze over and they start twitching.

Kevin
participants (4):
- Kevin Lyda
- Lester Caine
- Paul Eggert
- Stephen Colebourne