For development discussion of the GenHTMLAddon plugin.


-- CrawfordCurrie - 02 Feb 2003


New version (renamed to PublishAddOn) published 07 Jan 2003. -- EricScouten - 07 Jan 2003

New version 1.1 uploaded 3/1/02; bugs noted below all fixed and inclusion expressions added. -- CrawfordCurrie - 03 Jan 2002

Feature Proposals

Offline use

I plan to use this to automatically "publish" some webs in a static form elsewhere at work. One way to do it would be to have a daily job scan the wiki and call genhtml on each web that has a WebGenHtml topic. Options to the script would be put in that page: topics to exclude, style sheet (as an attachment to the page?), pre/post-processing scripts...

Other ideas?

PS: I can trigger the script via curl, e.g.:

    curl -s -S -F webname=$web -F action="Generate HTML" http://dumas.ilog.fr/wiki/bin/genhtml

-- ColasNahaboo - 13 Nov 2001
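
A minimal sketch of the daily-job idea above, assuming the TWiki data layout (data/&lt;Web&gt;/&lt;Topic&gt;.txt); the data directory and server URL are placeholders, and the function only prints the curl commands it would run:

```shell
# Hypothetical nightly publishing loop. The data directory and server
# URL are placeholders; a real cron script would execute the curl
# commands rather than just echoing them.
publish_webs() {
  datadir=$1
  for topic in "$datadir"/*/WebGenHtml.txt; do
    [ -f "$topic" ] || continue
    web=$(basename "$(dirname "$topic")")
    echo "curl -s -S -F webname=$web -F 'action=Generate HTML' http://example.com/wiki/bin/genhtml"
  done
}
```

Only webs containing a WebGenHtml topic are picked up, matching the proposal above.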

  • I would find configuration in this manner to be useful, as well: I'd like to be able to automate this with a cron task. -- SteveMokris - 04 Dec 2001

Yes, the module could be invoked from cron. The way to do it would be not to have a different main entry point, but instead to use a script that fills in a CGI object with the parameters and invokes processWeb directly. Something like this:

   use CGI;
   # Load the add-on modules; adjust the package paths to your install
   use TWiki::Plugins::GenHTMLAddon::GenHTMLIF;
   use TWiki::Plugins::GenHTMLAddon::GenHTML;

   my $query = new CGI("");
   # .... fill in $query with parameter values
   # .... there has to be enough there for TWiki to initialise
   my $wif = GenHTMLIF->getInterface( $query );
   GenHTML::processWeb( $wif, $query );

-- CrawfordCurrie - 05 Dec 2001

META expansion

The META information does not seem to be expanded (parent, ...). It seems to me that this is a limitation of the plugin API; is that true?

-- ColasNahaboo - 13 Nov 2001

There are many limitations of the plugin API! But I don't exactly know why it's not being expanded; is it something to do with the template not including the META tags?

-- CrawfordCurrie - 05 Dec 2001

Include Topics Regexp

I'd like to see an option for specifying, by a regexp list, a set of topics to include in the HTML generation, in addition to the exclude list. It could work something like this: the entire set of topics is matched against the "include" list (which defaults to "*"), then the resulting list is antimatched against the "exclude" list.
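
A sketch of that two-pass filter, assuming the lists are ordinary regular expressions rather than wildcard patterns; the topic names and patterns here are examples only:

```shell
# Include-then-antimatch filter: keep topics matching the include
# pattern, then drop those matching the exclude pattern.
# Patterns and topic names are illustrative only.
filter_topics() {
  include=$1; exclude=$2; shift 2
  for t in "$@"; do
    echo "$t" | grep -Eq "$include" || continue
    echo "$t" | grep -Eq "$exclude" && continue
    echo "$t"
  done
}
```

For example, `filter_topics '.*' '^Web' WebHome DesignNotes TestPlan` keeps DesignNotes and TestPlan while dropping the administrative WebHome topic.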


-- SteveMokris - 04 Dec 2001

Done -- CrawfordCurrie - 03 Jan 2002

I'd love for you guys to make changes to the code and upload new versions; we don't have a CVS repository, but any updated code posted here as a complete package can be checked out by the others before putting it onto the main AddOn page.

Or you can post code fragments here and wait for me to update; pressure of work means that will not happen before Christmas.

-- CrawfordCurrie - 05 Dec 2001



Random Comments/Wishlist

I've just tested this with the December release, and to say the least, this is excellent. I've thought of a number of modifications I'd love to have made to this though, which I may well implement - specifically:

  • Publishing based on Category values (or more generally, search results)
  • Destination directory user configurable.
  • List of pages to Include as well as Exclude. (See comment above.)
  • Integration with htmldoc to produce PDFs of whole webs automatically, perhaps after declaring a page a TOC page.
  • Lazy publishing - if the existing published page is newer than the last edited version, don't re-publish. (This clashes with the current setup, whereby non-existent Wiki links aren't linked, though...)
  • Skins! (Or did I miss that?)

All in all though, excellent.

-- MichaelSparks - 27 Dec 2001

Thanks Michael.

  • Category values; yes, good idea. I also thought of using a search expression to select topics to publish, but the search mechanism in TWiki is not easy to interface to.
  • Destination directory - I tried several alternatives for doing this; what I really wanted was to not have the output published on the server at all. What I'd really like would be for the generated docs to be zipped and the 'download file' dialog to appear to the user to allow local download of the pages, which should then be deleted from the server. But it's complicated....
  • Integration with htmldoc; good idea. We use htmldoc in our documentation flow but it's a hacked version. I haven't tried integrating with genhtml because PDF is falling out of favour as we try to get more docs on-line only.
  • Lazy publishing. How about using make?
  • Skins - yes, this should be done. And it should be quite easy.
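
The make suggestion amounts to a timestamp comparison; a minimal sketch of the lazy check, with placeholder file names:

```shell
# Re-publish only when the topic source is newer than the published
# page, or when no published page exists yet. File names are
# placeholders for the real topic and output paths.
needs_publish() {
  src=$1; dst=$2
  [ ! -f "$dst" ] || [ "$src" -nt "$dst" ]
}
```

A publishing loop would call this per topic and skip the generation step when it returns false, which is essentially what a make rule with the .html target depending on the .txt source would do.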

After a bit of thought, skins don't seem to me to make sense; skins are just alternative templates. You have the optional genhtml publishing template (genhtml.tmpl), can't you use that instead?

-- CrawfordCurrie - 03 Jan 2002

Wow, a very good addon! But I have some wishes:

  • I want to generate HTML files for more than one web. These files should be linked together; at the moment, the links to other webs are not correct.
  • The output directory should be pub/HTML/Web instead of pub/Web/HTML, so all webs would be under pub/HTML. It would then be easier to extract the generated HTML files for all webs.
  • The images subdirectory could be a problem if two topics have the same file names for attachments. Perhaps the images should go into images/Topicname, a subdirectory for each topic.
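
A sketch of the per-topic attachment directory from the last point; the function and path names are invented for illustration:

```shell
# Copy each topic's attachments into images/<TopicName>/ so that
# identically named attachments from different topics cannot
# overwrite each other. Names are illustrative only.
copy_attachment() {
  topic=$1; file=$2; outdir=$3
  mkdir -p "$outdir/images/$topic"
  cp "$file" "$outdir/images/$topic/$(basename "$file")"
}
```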

-- StefanScherer - 04 Apr 2002

Bug Fixes

Allow dashes in file names

For the version available on Nov 5:

in lib/TWiki/Plugins/GenHTMLAddon/GenHTML.pm

On lines 31 and 40, the regexp should include a dash, otherwise image files with dashes in their names won't get copied:

    while ( $text =~ m/=\"$au\/([-a-zA-Z0-9_.\/]*)/ ) { 
      $text =~ s/=\"$au\/[-a-zA-Z0-9_.\/]*/=\"$href/; 

On line 125, the mode was missing in the mkdir call:

    mkdir "$pd", 0777;

This code seems great. I especially like the use of an external style sheet!

-- ColasNahaboo - 05 Nov 2001

Done -- CrawfordCurrie - 03 Jan 2002

CGI::dump error

For the version of Oct 7, using perl v5.6.0 built for sun4-solaris, I was getting the following error:

    Undefined subroutine CGI::dump
     at ../lib/TWiki/Plugins/GenHTML/GenHTML.pm line 173

Changing the offending line from

    print $cgi->dump();

to

    print $cgi->as_string();

seems to have fixed the problem.

-- SteveMokris - 04 Dec 2001

CGI::dump is a part of the CGI module API; is your version up-to-date? Check on CPAN.

-- CrawfordCurrie - 05 Dec 2001

I had the same problem with GenHTMLAddon 1.1, using perl v5.6.1 built for hppa-linux and CGI.pm version 2.752. I think the problem arises because dump() was renamed to Dump(). The changelog of CGI.pm at http://stein.cshl.org/WWW/CGI/ states:

    Version 2.50 .... Changed dump() to Dump() to avoid name clashes.

-- FrankHartmann - 14 Jan 2002

So what's the "official" version of CGI.pm to use with TWiki?

-- CrawfordCurrie - 15 Jan 2002

I think this is the same question as which is the official perl version, because CGI.pm is part of the standard perl distribution.

  • perl 5.005_03 comes with CGI 2.46
  • perl 5.6.1 comes with CGI 2.752

Best might be to be flexible:

    use CGI;

    my $cgi = new CGI;
    if ($CGI::VERSION < 2.5) {
      # print "old CGI\n";
      print $cgi->dump();
    } else {
      # print "new CGI\n";
      print $cgi->Dump();
    }

-- FrankHartmann - 17 Jan 2002

bugfix for internal links

We have some pages with handmade internal links in the form

    <A HREF="#somewhereelse">bla</A>

The GenHTMLAddon adds a .html into these links, but this can be fixed easily in the lib GenHTML.pm. Just change the regular expression for the first capture.

Here is a diff:

    <     $text =~ s/(<a href=\"[A-Z0-9_]*)(\#[^\"]*)?\"/$1.html$2\"/gio;
    >     $text =~ s/(<a href=\"[A-Z0-9_]+)(\#[^\"]*)?\"/$1.html$2\"/gio;

-- StefanScherer - 03 Apr 2002

Topic revision: r18 - 2003-02-02 - CrawfordCurrie