When using twiki/bin/view to view a topic, if the topic has tables with a large number of rows, view hangs and eventually times out when it hits Apache's limit (600 seconds). This seems to happen with tables that have 10,000+ rows.

Edit to add: the version of TWiki we used previously, Cairo (01 Sep 2004 $Rev: 1742 $), was able to render these tables relatively quickly (within 10 seconds or so). This problem only started occurring when we upgraded to version 4.1.1.

Are there any fixes available for this problem?


TWiki version: TWikiRelease04x01x01
TWiki plugins: SpreadSheetPlugin, ActionTrackerPlugin, BatchUploadPlugin, CalendarPlugin, ChartPlugin, CommentPlugin, DBIQueryPlugin, DatabasePlugin, EditTablePlugin, EditTablerowPlugin, FileListPlugin, HeadlinesPlugin, IfDefinedPlugin, InterwikiPlugin, LocalCityTimePlugin, OoProjectPlannerPlugin, PreferencesPlugin, RenderListPlugin, SlideShowPlugin, SmiliesPlugin, TWikiDrawPlugin, TablePlugin, TwistyPlugin, WysiwygPlugin
Server OS: Linux 2.4.21-32.elsmp (i686-linux-thread-multi-64int)
Web server: httpd-2.0.52-12.ent (Apache)
Perl version: 5.008008 (linux)
Client OS:  
Web Browser:  
Categories: Fatal error

-- BenEsacove - 17 Jan 2008



I just tested this. It seems that expanding tables written in TML syntax (|...|) now consumes a lot of CPU and memory. For my test with 20,000-line tables I used this to generate them:

i=20000;while let 'i-->0';do echo "| $i | $RANDOM  | $RANDOM | $RANDOM | $RANDOM | $RANDOM | $RANDOM | $RANDOM | $RANDOM |";done >TableTml.txt

i=20000;( echo '<table>';while let 'i-->0';do echo "<tr><td> $i <td> $RANDOM <td> $RANDOM <td> $RANDOM <td> $RANDOM <td> $RANDOM <td> $RANDOM <td> $RANDOM <td> $RANDOM";done;echo "</table>") >TableHtml.txt 
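To reproduce this on a smaller scale first, something like the following generates a tiny TML table so you can confirm the format before committing to 20,000 rows (a sketch assuming bash, since $RANDOM and the arithmetic are bash-isms; the file name is arbitrary):

```shell
# Generate a 5-row TML table as a sanity check before the full 20,000-row run.
# Assumes bash; $RANDOM is a bash builtin, not POSIX sh.
i=5
while [ "$i" -gt 0 ]; do
    echo "| $i | $RANDOM | $RANDOM |"
    i=$((i - 1))
done > /tmp/TableTmlSmall.txt

wc -l < /tmp/TableTmlSmall.txt   # prints 5
```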

Generating the topic (timed with wget) uses up, on the server:

  • HTML: 18 seconds, 65 megabyte ram
  • TML: 121 seconds, 465 megabyte ram
The rendering in the browser then seems fast enough (a few seconds). Thus, the huge memory consumed by TML table generation is the problem: it is slower, most probably induces swapping, and will be devastating if you use mod_perl or another Perl accelerator, as it will have bloated your Perl interpreter.

Thus, the only advice we can offer as a short-term solution is to generate your huge tables in HTML format. I sincerely hope they are machine-generated tables; I cannot imagine you hand-edit 10,000-row tables in a wiki... You may even want to generate these pages as static HTML on a separate server and link to them.
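As a rough illustration of that workaround, a sed pipeline can turn existing | ... | rows into plain HTML rows. This is a hypothetical helper, not a complete converter: it assumes simple TML tables with one row per line and no TML markup inside the cells.

```shell
# Convert simple TML table rows ("| a | b |") into HTML table rows.
# Only handles one-line rows with no nested TML inside cells.
tml2html() {
    echo '<table>'
    sed -e 's/^|/<tr><td>/' \
        -e 's/|$/<\/td><\/tr>/' \
        -e 's/|/<\/td><td>/g'
    echo '</table>'
}

printf '| 1 | foo |\n| 2 | bar |\n' | tml2html
```

For the machine-generated case, the same substitutions could be folded directly into whatever script produces the data.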

As a long-term solution, well, slowness on huge tables may be the price to pay for the added functionality on small tables, which are the intended use case of wikis, so there may be no hope of speeding up the huge-table case...

-- ColasNahaboo - 18 Jan 2008

Wow, this table size is extreme... which makes it a good performance test case. However, are you facing this table size in real life as well? I think there are better ways to store that amount of data.

Anyway, you are using some plugins that parse the complete topic text directly, bypassing the TWiki parser. Try disabling them to narrow down which of them is the resource hog, assuming it is not the core itself.

-- MichaelDaum - 18 Jan 2008

First of all, try with TablePlugin disabled. That plugin does a lot of parsing.
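If TablePlugin turns out to be the culprit, it can be switched off without uninstalling it. In TWiki 4.x this is normally done through bin/configure, or (a sketch, assuming the standard plugin configuration keys) with a line in twiki/lib/LocalSite.cfg:

```
# Disable TablePlugin via the standard TWiki 4.x plugin switch.
# Verify the exact key in bin/configure before relying on it.
$TWiki::cfg{Plugins}{TablePlugin}{Enabled} = 0;
```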

-- ArthurClemens - 18 Jan 2008

Maintaining a static page of that size is probably not feasible even if the performance is right. However, there is a use case: a SEARCH could potentially return a very big table. Hence it is worthwhile investigating TML table performance, especially why it dropped between TWiki 01 Sep 2004 (Cairo) and 4.1.1.

-- PeterThoeny - 19 Jan 2008

I filed TWikibug:Item5268 to follow up on this.

-- PeterThoeny - 19 Jan 2008

Even a SEARCH should not return a table of that size. You will need to implement some kind of pagination or refine your search query to narrow down the results. All "systems" along the way suffer from large tables: the TWiki server that needs to compute them, the network bandwidth, the browser that needs to pull in all that HTML, and last but not least the user who is overwhelmed by that amount of information. So all of this takes a long time to compute and transmit, and has questionable value for the user.
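On the refinement point: TWiki's SEARCH variable accepts a limit parameter that caps the number of topics returned, which at least bounds the size of the result table (a sketch; the search string and web name are placeholders):

```
%SEARCH{ "needle" web="Sandbox" limit="50" nosearch="on" }%
```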

-- MichaelDaum - 21 Jan 2008

I just wanted to say thanks for the suggestions. We're still deciding what we want to do about the tables. Using HTML to generate new tables will work OK, but there is existing data for which that may not work well.

-- BenEsacove - 29 Jan 2008

Topic attachment: configure.htm (173.4 K, uploaded 2008-01-17 by BenEsacove) - configure settings for our server
Topic revision: r7 - 2008-01-29 - BenEsacove