Downloading of documents and of segments of the WIKI for use offline
Hello all,

I was recently asked by an ATRT2 participant whether it is possible to download a complete set of WIKI pages and their attachments for reading offline, since many of us are looking at a 30+ hour journey to Durban. My response was that we could make a repository of all the documents and group them into one zip file ready for an easy download. Does anyone here have other ideas? Perhaps some special software?

Kind regards,

Olivier
I know that there is software for downloading a complete web site for reading offline, and this should normally work for a wiki site. If I remember the name of one of these programs, I will let you know.

Yaovi

-----Original Message-----
From: Olivier MJ Crepin-Leblond <ocl@gih.com>
Date: Wed, 26 Jun 2013 12:18:03
To: ttf@atlarge-lists.icann.org
Subject: [technology taskforce] Downloading of documents and of segments of the WIKI for use offline
There are programs that will crawl a website for offline use, such as http://www.httrack.com/. But the sheer number of links on a wiki page makes this unwieldy to do (unless the crawler is configured carefully to ignore the links pointing to other WGs and workspaces).

Usually I manually export pages as PDFs (under the "Tools" menu when logged in, there is an "Export to PDF" option), although the downside is that comments on the page aren't included in the PDF. If the wiki comments are important, then I print the pages to PDF.

Space admins (such as At-Large Staff) have the option to export entire spaces, or a subset of pages in a space, as PDFs or as static HTML pages. See https://confluence.atlassian.com/display/CONF43/Exporting+Confluence+Pages+a... This seems to be the most straightforward method.

Dev Anand

On Wed, Jun 26, 2013 at 6:27 AM, Yaovi Atohoun <yaovito@yahoo.fr> wrote:
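To illustrate the point above about configuring a crawler carefully, here is a minimal sketch (in Python) of the kind of link filter a crawl needs so it stays inside one wiki space rather than wandering into every other WG workspace. The host and space path below are assumptions for illustration only, not the actual wiki's layout:

```python
from urllib.parse import urljoin, urlparse

# Hypothetical example values: restrict the crawl to a single wiki space.
WIKI_HOST = "community.icann.org"   # assumed wiki host, for illustration
SPACE_PREFIX = "/display/atlarge/"  # assumed space path, for illustration

def in_scope(base_url: str, href: str) -> bool:
    """Return True only for links that stay inside the chosen wiki space."""
    target = urlparse(urljoin(base_url, href))
    if target.scheme not in ("http", "https"):
        return False                    # skip mailto:, javascript:, etc.
    if target.netloc != WIKI_HOST:
        return False                    # skip links to external sites
    # Skip pages belonging to other WGs and workspaces.
    return target.path.startswith(SPACE_PREFIX)

base = "https://community.icann.org/display/atlarge/Home"
print(in_scope(base, "/display/atlarge/Meetings"))  # in-space link -> True
print(in_scope(base, "/display/gnso/Home"))         # other workspace -> False
print(in_scope(base, "mailto:staff@icann.org"))     # not a page -> False
```

Offline-mirroring tools express the same idea declaratively; HTTrack, for instance, accepts `+pattern`/`-pattern` scan filters on the command line to include or exclude URLs from the crawl.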
I know that there are software to download complete web site for reading offline and this should normally work for a wiki site. If i remember one of these software I will let you know.
participants (3)
- Dev Anand Teelucksingh
- Olivier MJ Crepin-Leblond
- Yaovi Atohoun