See also CgiScripts

In the early versions of TWiki, the core functionality was held in these CGI scripts, hence TWiki was considered a FatClient. I and others propose thinning out the functionality held in the CGI scripts so that only CGI-related work is done there. We suggest this because fatness in the scripts hurts the elegance of the code structure and prevents the functionality being reused from other routes, such as WebServices.

This means blacklisting those of TWiki's CgiScripts that don't conform and reworking each script so that its functionality is moved into a DotPm file in the TWikiLib directory. To do this, find the CGI script's topic in the list above (or create the topic if it does not exist) and add the keyword FatCgiScript to it. This adds it to the listing below:
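As a minimal sketch of the proposed shape, assuming a hypothetical module name TWiki::UI::View (not the real API of the 2002 codebase): the DotPm file holds the functionality, and the CGI script shrinks to parsing the request and handing off.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Sketch of the library side: the real work lives in a DotPm module.
# The package and sub names here are illustrative assumptions.
package TWiki::UI::View;

sub view {
    my ( $web, $topic ) = @_;
    # ... access checks, topic reading and rendering would live here ...
    return "rendered $web.$topic";
}

package main;

# This is all a "thin" bin/view script would then contain: gather the
# request parameters and delegate.
my ( $web, $topic ) = ( 'Codev', 'WebHome' );
print TWiki::UI::View::view( $web, $topic ), "\n";
```

Because the functionality now lives in a module rather than a script body, it can be called from mod_perl handlers, tests, or WebServices front ends without going through CGI at all.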

Searched: FatCgiScript.*

Results from Codev web retrieved at 01:22 (GMT)

  • CategoryStale The following are the CgiScripts used by TWiki: % SEARCH{search "CgiScript" scope "topic" regexp "on" nosearch "on" nototal "on" header "" format "...
  • Module name changes ChangesCgiScript Location TWIKIROOT/bin/changes Summary Presents to user recent changes made to the current web Primary Author...
  • FatCgiScripts are CgiScripts that contain key functionality. They are the norm but are bad. See SlimDownFatCgiScripts for details. MartinCleaver 23 Jun 2002 CategoryStale...
  • Module name mailnotify MailnotifyCgiScript Location TWIKIROOT/bin/mailnotify Summary Produces periodic notifications based on whether people have added...
  • Module name manage ManageCgiScript Location TWIKIROOT/bin/manage Summary tools to manage the twiki using the twiki :) Primary Author PeterThoeny...
  • MegaTWiki Creature Feep at it's finest, on the road to TWikiOO! After receiving feedback about my local implementation of TWiki at Sun, and reading up on the various...
  • MegaTWikiServiceRegistrationMethods At the moment, MegaTWiki implements a rudimentary service registration which "notifies" its service dispatch method which services...
  • Implemented: Move script functionality into TWiki::UI module Description The attached zip files contains a patch that removes the bulk of the functionality from most...
  • I've just started installing plugins on my site and was horrified to see the number of scripts being added to /bin at this rate with only a few plugins installed...
  • Module name rdiff RdiffCgiScript Location TWIKIROOT/bin/rdiff Summary To present a view of differences between revisions of topics Primary Author...
  • Refactoring FatCgiScripts into a module One of the things O'Wiki has done recently that I like a lot is a refactoring of most of the scripts in bin/ into subroutines...
  • Module name rename RenameCgiScript Location TWIKIROOT/bin/rename Summary Functionality to rename a topic Primary Author PeterThoeny CVS history...
  • Feature: Script to create a new TWiki web Creating a new web is currently a manual process as described in TWikiDocumentation. You need to create directories, set...
  • One detail of the ScriptToCreateNewWeb that made it into BeijingRelease is left: Create web does not copy file attachments. A clean way is to implement a TWiki::Store...
  • See also CgiScripts In the early versions of TWiki, the core functionality was held in these CGI scripts, hence TWiki was considered a FatClient. I and others propose...
  • See also Wikilearn.TWikiVariablesForTWikiNewbies . I have started to provide additional documentation for TWiki in a literate programming style using Leo (Literate...
  • Module name upload UploadCgiScript Location TWIKIROOT/bin/upload Summary Used to implement the uploading of attachments Primary Author NicholasLee...
  • Module name view ViewCgiScript Location TWIKIROOT/bin/view Summary Handles viewing of Topics Primary Author PeterThoeny CVS history http...
Number of topics: 18

-- MartinCleaver - 23 Jun 2002

Q: How do you detect FatCgiScripts?

A: Well, take the recently added manage script as an example: ScriptToCreateNewWeb. This would be a very simple slim-down effort, as the functionality that makes it fat is neatly held in four separate subroutines:

  • createEmptyWeb: creates an empty web
  • copyWebTopics: copies template files into the newly created web
  • copyOneTopic: used by the above
  • patchWebPreferences

These probably need to be redistributed to the following modules:

  • PublicMethods: createEmptyWeb
  • PrivateMethods: copyWebTopics, copyOneTopic, patchWebPreferences (called with the text to add)

  • PublicMethods: addTextToWebPreferences
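A sketch of how that split might land in a module. The module name TWiki::ManageWeb and the leading-underscore convention for private subs are assumptions for illustration, not the actual TWiki layout; the helper bodies are stubbed.

```perl
use strict;
use warnings;

# Illustrative module holding the four subroutines pulled out of the
# manage CGI script. Names here are assumptions, not the real TWiki API.
package TWiki::ManageWeb;

# Public method: the entry point a thin CGI script would call.
sub createEmptyWeb {
    my ($newWeb) = @_;
    # ... mkdir the web's data directory, seed mandatory files ...
    _copyWebTopics( '_default', $newWeb );
    return 1;
}

# Public method: proposed addition for managing preferences topics.
sub addTextToWebPreferences {
    my ( $web, $text ) = @_;
    return _patchWebPreferences( $web, $text );
}

# Private helpers, marked by the leading-underscore convention.
sub _copyWebTopics {
    my ( $templateWeb, $newWeb ) = @_;
    _copyOneTopic( $templateWeb, $newWeb, $_ ) for qw( WebHome WebPreferences );
}

sub _copyOneTopic {
    my ( $fromWeb, $toWeb, $topic ) = @_;
    # ... copy the topic file and its revision history ...
}

sub _patchWebPreferences {
    my ( $web, $textToAdd ) = @_;
    # ... append $textToAdd to the web's WebPreferences topic ...
    return 1;
}

package main;
print TWiki::ManageWeb::createEmptyWeb('Sandbox') ? "web created\n" : "failed\n";
```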

However, I won't pretend these things are simple. PrefsDotPm as it currently stands is only about retrieving run-time values of preferences. The addition I have just proposed enlarges its functionality to include management of the preferences topic files.

Ah, such joy.

Code for the fat version I refer to is at http://cvs.sourceforge.net/cgi-bin/viewcvs.cgi/twiki/twiki/bin/manage?rev=1.3&content-type=text/vnd.viewcvs-markup

I currently have a "plugin" of sorts that helps with editing/saving web preferences from a cgi script, but the code has so many hooks into the TWiki core code that I don't want to release it for general consumption yet, unless folks are willing to take a gander at my whole TWiki implementation.

It's currently called MegaTWiki, because there's so much junk in it (maybe it should be called JunkyTWiki ;-), but it implements a bunch of these features, including the slimming down bit.

All the functionality in my plugin is accessed via a script called 'mega' which only calls a handler object which does all the work. Features are listed here.

I'm currently working on rolling these features into TWikiOO, but can make them work for both.

-- PeterNixon - 24 Jun 2002

Please see Support.TWikiOnSourceForgeSlowStatistics which discusses some timeout problems I've been experiencing. I'm hoping that if the fat CGI scripts are ever slimmed down, something can be done at the same time to address the timeout problems. (Note that I'm not sure we've come to a conclusion as to what causes those problems, but we have some hints.)

-- RandyKramer - 03 Jul 2002

The fatness will make little or no difference to the speed of the server. It only affects the elegance of the code structure and ability to use the code's functionality from other routes such as WebServices.

-- MartinCleaver - 05 Jul 2002

I understand that the fatness will make little or no difference to the speed of the server. I'd still like to encourage the developers to consider anything they can do to improve the situation. Just as a (probably incorrect) example, if RCS check-in during preview was part of the problem (I don't suspect that at this time), maybe something like caching the preview on the server side might help. (But maybe that would require cookies?) Anyway, I don't really know what I'm talking about, just trying to encourage that we consider ways to improve the situation.

-- RandyKramer - 05 Jul 2002

I think that the CgiScripts should only:

  1. work out what someone wants to do
  2. display the skin
  3. get the WikiML from the MiddleTier
  4. call WikiML::HTML to get the WikiML's rendering as HTML

Even authentication should be done by the MiddleTier.

-- MartinCleaver - 13 Jul 2002

Agreed, though that seems a little too implementation specific (no package names yet, please, only concepts).

I'd rather put it this way:

CGI scripts should:

  1. work out what someone wants to do
  2. call the handler for that task
  3. let the handlers do the rest, using TWikiServices to do so.

Or even simpler (the MegaTWiki method):

CGI scripts should:

  1. call the TWiki request handler, which works out what someone wants to do and calls the appropriate handler
  2. let the handlers do the rest, using TWikiServices to do so.
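A dispatch table makes this "single request handler" shape concrete. This is only a sketch of the pattern; the action names and handler bodies are illustrative, not the real MegaTWiki or TWiki API.

```perl
use strict;
use warnings;

# A single request handler: one table maps each action name to the code
# that services it, so the CGI script itself stays trivially thin.
my %handlers = (
    view => sub { "handle view of $_[0]" },
    edit => sub { "handle edit of $_[0]" },
    save => sub { "handle save of $_[0]" },
);

sub handleRequest {
    my ( $action, $topic ) = @_;
    my $handler = $handlers{$action}
        or return "unknown action '$action'";
    return $handler->($topic);
}

print handleRequest( 'view', 'WebHome' ), "\n";
```

Adding a new feature then means registering one more entry in the table, rather than dropping another script into bin/.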

See ModuleLoadingPerformanceEnhancements for a way to speed up compile/load times of TWiki's perl code. This has no bearing on execution times (e.g. searches will not see much improvement here).

-- PeterNixon - 12 Jul 2002

Re the slowdown on twiki.org a few weeks back - this was just due to overloading of SourceForge servers IMO, as TWiki performs fine on other sites, e.g. my http://donkin.org/ site, even without ModPerl.

It would be well worth ensuring that the modularised TWiki works well with ModPerl, as this gives a huge speedup, typically 10 times faster. Should then also work with SpeedyCGI, which is probably more suited to intranet installations where the TWiki administrator doesn't have the root access typically required to add mod_perl to Apache.

-- RichardDonkin - 18 Jul 2002

Okay, Peter, so you would agree that displaying the skin and then calling the backend function is the only responsibility of the CgiScript?

-- MartinCleaver - 01 Sep 2002

I might go a little further and specify a two-tier approach by saying that the CgiScript's responsibility is only to call the backend function, which displays an appropriate skin for said function before calling its subfunctions to fill in the skin content.

-- PeterNixon - 26 Sep 2002

Okay, good, so we need to debate why I think invoking skin-displaying should be a function of the CGI script and why you think it should not.

(I need to rework this insert a bit; I am in an internet cafe and my time is almost up)

From Carnegie Mellon's Software Engineering Institute http://www.sei.cmu.edu/str/descriptions/twotier.html#512860: ...

Two tier architectures consist of three components distributed in two layers: client (requester of services) and server (provider of services). The three components are

  • User System Interface (such as session, text input, dialog, and display management services)
  • Processing Management (such as process development, process enactment, process monitoring, and process resource services)
  • Database Management (such as data and file services)

The two tier design allocates the user system interface exclusively to the client. It places database management on the server and splits the processing management between client and server, creating two layers.


Possible alternatives for two tier client server architectures are

  • the three-tier architecture (see Three Tier Software Architectures) if there is a requirement to accommodate greater than 100 users
  • distributed/collaborative architectures (see Distributed/Collaborative Enterprise Architectures) if there is a requirement to design on an enterprise-wide scale. An enterprise-wide design is comprised of numerous smaller systems or subsystems.


My main aim is to be able to fetch the content of a topic without having to take TWiki's look and feel with it. This would allow the same application-level routine to be callable from a different front end, such as TWiki embedded in Mason or TWiki called from WebServices. The different front end would be responsible for displaying its own look and feel, yet call TWiki's "get the WikiML from the MiddleTier/render as HTML" routines.

The articles continue at Three Tier Software Architectures: http://www.sei.cmu.edu/str/descriptions/threetier.html


The three tier software architecture (a.k.a. three layer architectures) emerged in the 1990s to overcome the limitations of the two tier architecture (see Two Tier Software Architectures). The third tier (middle tier server) is between the user interface (client) and the data management (server) components. This middle tier provides process management where business logic and rules are executed and can accommodate hundreds of users (as compared to only 100 users with the two tier architecture) by providing functions such as queuing, application execution, and database staging. The three tier architecture is used when an effective distributed client/server design is needed that provides (when compared to the two tier) increased performance, flexibility, maintainability, reusability, and scalability, while hiding the complexity of distributed processing from the user. For detailed information on three tier architectures see Schussel and Eckerson. Schussel provides a graphical history of the evolution of client/server architectures.


I'll write more soon. M.

-- MartinCleaver - 27 Sep 2002

Hmmm... maybe I misused the phrase "two tier" there a bit; I'm not a programming history major. I also wasn't thinking toward that extreme a vision. In Martin's vision, the content is completely disconnected from its look and feel, which brings up the question: how does the skin know what to do with it?

If the content is an application (not just a topic), how does the skin know how to render the menu system or the forms therein, so they'll make sense to the end user? Certainly we can't place tight restrictions on where you can place a menu on a page (at the end would be silly; on the top might take up too much room; on the side might be better; who knows, and what is the client capable of?). It must be rendered in the context of the application, as the developer intended. Therefore the skin rendering system must be aware of the application, the client, and the content being delivered. You can't just call up a skin and stuff some content into it in a brain-dead fashion unless the skinning system has some preconceived notion of where to put it, and whether or not the skin is appropriate for the content.

If we're to continue on this particular path, we have to come up with some standard content types (ugh, another standard), and skin classes which use them.

We may need to extend WikiML, or just extend the use of the templating system so that it is integrated with the WikiML when applications are developed at the lower TWikiService level, where the page content is difficult to separate from the skin (maybe TWikiServices can generate dynamic skins). See the administration screens in MegaTWiki for an example of an application which requires special rendering: the screens are not contained within topic text; they are generated by MegaTWiki services.

My main point is that the request/service/WebService/TWikiService/Whatever may not be as easily separated from the skin as you may think.

So, maybe what the CGI script should be doing is this:

  1. Call function to get content and an appropriate skin type or class.
  2. Get the appropriate template (which matches the end user's client and the application class) and render it.
  3. Divide the content and stuff it into appropriate locations in the template.
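Step 3 above could be sketched as stuffing named content sections into placeholder slots in a template. The %TAG% placeholder syntax and the fillTemplate name are made up for illustration; they are not TWiki's actual template markup.

```perl
use strict;
use warnings;

# Illustrative slot-filling: replace each %NAME% placeholder in a skin
# template with the matching piece of content; unknown slots go empty.
sub fillTemplate {
    my ( $template, %slots ) = @_;
    $template =~ s/%([A-Z]+)%/ exists $slots{$1} ? $slots{$1} : '' /ge;
    return $template;
}

my $skin = "<h1>%TITLE%</h1>\n<div>%BODY%</div>\n";
print fillTemplate( $skin, TITLE => 'WebHome', BODY => 'Welcome!' );
```

The point of the sketch is that the division of content into slots, and the choice of which template to fill, both depend on knowing the application class and the client, as argued above.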

-- PeterNixon - 27 Sep 2002

The other way to slim them down and also significantly speed up response is to create a dependency graph (or call tree) and ensure that the functions are spread across the dot pee em files so that only those that are needed are loaded. What do I mean by OnlyLoadWhatsNeeded? Well, look at the counter-example: put all of the functions into TWiki.pm. Invoking any of them means the whole file is read in, parsed and compiled to intermediate code by the perl interpreter.
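As a minimal illustration of OnlyLoadWhatsNeeded, a run-time require() defers a module's parse-and-compile cost until the code path that needs it is actually taken, whereas use() pays that cost on every invocation of the script. Data::Dumper here is just a stand-in for any heavyweight module.

```perl
use strict;
use warnings;

# Lazy loading: the module is parsed and compiled only on the first call
# to this sub, not at script startup.
sub dumpStructure {
    my ($ref) = @_;
    require Data::Dumper;    # deferred load; a no-op on later calls
    return Data::Dumper::Dumper($ref);
}

print 'loaded at startup? ', ( $INC{'Data/Dumper.pm'} ? 'yes' : 'no' ), "\n";
print dumpStructure( { web => 'Codev' } );
print 'loaded after call? ', ( $INC{'Data/Dumper.pm'} ? 'yes' : 'no' ), "\n";
```

Checking %INC before and after the call shows the load really is deferred.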

What we probably have now makes some unnecessary loads. Having the plug-ins is a good idea, as it does isolate functionality, but if the plugin dot pee em files are each loaded just for the initialization function ...

Something akin to the "autoloader" should be investigated.

Also, in the standard perl distribution, there is a tool called "autosplit" which splits a file into separate modules, each with one function. This can be used with the autoloader to manage dynamic "on demand" load and compile.
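The underlying trick can be sketched without touching the filesystem: an AUTOLOAD stub compiles a sub's body only on its first call, which is essentially what AutoLoader does with the per-sub files that autosplit writes out. The %deferred hash below stands in for those on-disk files.

```perl
use strict;
use warnings;

# Sketch of the autoloader idea. Calling an undefined sub triggers
# AUTOLOAD, which compiles the sub's body and re-dispatches to it;
# every later call hits the compiled sub directly.
my %deferred = (
    heavy => q{ sub heavy { return 'compiled on demand' } },
);

our $AUTOLOAD;
sub AUTOLOAD {
    ( my $name = $AUTOLOAD ) =~ s/.*:://;    # strip the package qualifier
    my $body = $deferred{$name}
        or die "no deferred body for $name";
    eval $body;                              # parse and compile now
    die $@ if $@;
    no strict 'refs';
    goto &{$AUTOLOAD};                       # re-dispatch with the same @_
}

print heavy(), "\n";    # first call goes through AUTOLOAD
print heavy(), "\n";    # later calls bypass AUTOLOAD entirely
```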

I know from my days as a "low level" programmer that C programmers have a tool that can take in the sources and libraries and show the call-dependency graph. This would be the guide to how the modules should best be partitioned for minimal (i.e. fastest, least amount parsed and compiled) loading. I don't know if such a tool exists for perl.

See also: CommonFrontEndCgiScript, ModuleLoadingPerformanceEnhancements, CachedResponses, HowShouldTWikiBeModularized, ModPerl and others on performance issues.

-- AntonAylward - 12 Jan 2003

I defer this to DakarRelease since it is not a critical feature holding up CairoRelease.

-- PeterThoeny - 19 Dec 2003

My view is that this was mostly completed by Crawford in CairoRelease, so I have marked it done.

-- MartinCleaver - 26 Sep 2004

Topic revision: r20 - 2004-09-26 - MartinCleaver