Question
At least Apache has a configuration option:
KeepAlive (see
Apache documentation for KeepAlive). When this is set like
KeepAlive On
in httpd.conf, TWiki takes forever to handle include files such as:
%INCLUDE{"http://localhost/twiki/pub/ost.txt"}%
Is this a known issue? I didn't find anything when searching for "KeepAlive" or "delay", except the similar
DelayOnInclude topic, which didn't receive this answer either.
The workaround is to change
KeepAlive On
to
KeepAlive Off
in httpd.conf (usually found in /etc/ on unix systems - but locations do vary...) on the server
where the included files are.
KeepAlive is meant primarily as a performance booster, but TWiki doesn't work with it as far as I can see.
See
SlowPage where I've included a 5 byte file from a server running with
KeepAlive On
. I know twiki.org is a high-traffic server, but the
SlowPage contains %INCLUDE{"http://demo.capmon.dk/~pvm/ost.txt"}% from a server I know is configured with
KeepAlive On
. On my server it takes 15+ seconds to load.
The reason for this can be found in TWiki::Net.pm (in sub getUrl), where the following can be found at line 94:
while( <SOCK> ) { $result .= $_; }
Because the server doesn't close the socket (the purpose behind
KeepAlive) this little loop sits around waiting for the
KeepAlive timeout to occur (15 seconds in the default Apache httpd.conf configuration). So every page that %INCLUDE%s a file over HTTP will take at least 15 seconds to display.
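To illustrate the point, here is a minimal sketch (not TWiki's actual code; the function name is made up) of a Content-Length-aware read that stops after the declared body length instead of blocking until the server finally closes the socket:

```perl
use strict;
use warnings;

# Read an HTTP response without waiting for EOF: parse the headers,
# then read exactly Content-Length body bytes. This is only a sketch;
# a real client must also handle chunked transfer encoding.
sub read_http_response {
    my ($fh) = @_;
    my ($headers, $body) = ('', '');
    while (my $line = <$fh>) {
        $headers .= $line;
        last if $line =~ /^\r?\n$/;   # blank line ends the header section
    }
    if ($headers =~ /^Content-Length:\s*(\d+)/mi) {
        read($fh, $body, $1);         # read exactly the declared length
    } else {
        local $/; $body = <$fh>;      # no length given: fall back to read-to-EOF
    }
    return ($headers, $body);
}

# Demonstration with an in-memory response instead of a live socket
my $response = "HTTP/1.1 200 OK\r\nContent-Length: 5\r\n\r\nHello world";
open my $fh, '<', \$response or die $!;
my ($hdr, $body) = read_http_response($fh);
print "body='$body'\n";   # prints body='Hello'
```

With a length-bounded read like this, the loop returns as soon as the body has arrived, even if the server keeps the socket open for keep-alive.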
I guess the reason for implementing the HTTP protocol in TWiki instead of using e.g.
LWP is to keep the components lightweight and therefore fast, but I just wanted to let you know that the getUrl algorithm is too simple for servers configured with KeepAlive and seriously degrades performance. This little script, taken from perldoc LWP, handles
KeepAlive without timeouts. I am especially unsure about the $theHeader parameter to getUrl, and so won't supply a patch:
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Request;

my $ua = LWP::UserAgent->new;
$ua->agent("AgentName/0.1 " . $ua->agent);

# Create a request
my $req = HTTP::Request->new(GET => 'http://localhost/twiki/ost.txt');

# Pass request to the user agent and get a response back
my $res = $ua->request($req);

# Check the outcome of the response
if ($res->is_success) {
    print $res->content;
} else {
    print "Bad luck this time\n";
}
- TWiki version: 1Feb2003
- Perl version: 5.6.0 and 5.8.0
- Web server & version: Apache 1.3 and Apache 2.0
- Server OS: SuSE 7.2 and RedHat 8.0
- Web browser & version: IE6.0 and Mozilla 1.3 (not related to browser)
- Client OS: Windoze 2000
--
PeterMorch - 23 Mar 2003
We are using the TWiki version of December 2001, and I think I am seeing
similar behavior there. To work around the issue, I went
to the Edit/Preferences/Advanced/HTTP Networking dialog in Mozilla 1.3 and
deselected the "Enable Keep-Alive" checkboxes there. That seems to fix the
issue, albeit at the price of slower performance, of course.
--
ClausBrod - 24 Mar 2003
- Keep-Alive is a problem because TWiki can't handle it when INCLUDEing sub-pages. It has nothing whatsoever to do with the browser you're using to access the resulting finished page. -- PeterMorch - 02 Apr 2003
Peter, what you're saying makes perfect sense to me, and yet it is obvious
here that the problem first occurred after people at our site started moving
to Mozilla 1.3. I have never heard any complaints or seen any issues
related to Keep-Alive before that. Maybe there were changes in Mozilla's
Keep-Alive handling which have side effects on the TWiki code. Unfortunately,
the LWP stuff you described is pretty much over my head right now, or else I
would have already tried to add a patch and test it. My assumption is (correct
me if I'm wrong) that using LWP shouldn't be that much of a performance issue
when running TWiki under
ModPerl.
--
ClausBrod - 03 Apr 2003
Answer
Good point about using LWP - this should probably be provided as an option, although that's a fairly heavyweight package to include. It would be possible to only load LWP when doing includes of pages from remote servers, which would reduce the hit on most page accesses.
- Huh? My problem was exactly accessing pages from localhost, and the localhost webserver had KeepAlive on. I guess the reason for only allowing includes from URL space is security (imagine
%INCLUDE{"file://etc/passwd"}%
would leave the system wide open). Because of that, the KeepAlive problem is also real for files located on localhost if the local webserver happens to be running with KeepAlive on. Nothing special about localhost... -- PeterMorch - 02 Apr 2003
ModPerl would probably be needed for good performance, which is not always easy to set up, even if you have root access (unusual on many intranet servers and most Internet hosts). I know one person who tried installing TWiki on
ModPerlWindows and gave up, installing Drupal instead for intranet use, so a simpler installation of
ModPerl and TWiki would be important.
--
RichardDonkin - 24 Mar 2003
I have a TWiki on Cygwin that has been driving me absolutely nuts trying to figure out why, as soon as I enable certain skins, the page load time goes up to 60+ seconds. Now I know: it's because I'm including 3 CSS files on every page.
UPDATE: Unfortunately turning keepalive off does not change the page load times for this particular deployment.
--
MattWilkie - 26 Mar, 03 April 2003
We were seeing this problem and fixed it thusly:
In Net.pm
my $req = "GET $theUrl HTTP/1.1\r\n";
changes to
my $req = "GET $theUrl HTTP/1.0\r\n";
This doesn't seem to affect performance, but makes %INCLUDE%s work again.
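An alternative sketch (an editorial suggestion, not tested against TWiki itself; the helper name is illustrative): keep HTTP/1.1 but send an explicit Connection: close header, so virtual hosts still work and the server closes the socket after replying instead of holding it open for keep-alive:

```perl
use strict;
use warnings;

# Hypothetical helper: build the GET request the way Net.pm does, but
# with "Connection: close" so an HTTP/1.1 server will not keep the
# socket open waiting for further requests.
sub build_request {
    my ($host, $path) = @_;
    return "GET $path HTTP/1.1\r\n"
         . "Host: $host\r\n"          # required by HTTP/1.1, enables virtual hosts
         . "Connection: close\r\n"    # server closes the socket after the response
         . "\r\n";
}

my $req = build_request('demo.capmon.dk', '/~pvm/ost.txt');
print $req;
```

This keeps the HTTP/1.1 header format consistent while avoiding the keep-alive timeout that the read loop would otherwise wait for.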
--
DavisWFrank - 16 Apr 2003
I just changed the line as suggested and it seems to work OK, but I wonder whether to expect any side effects.
--
AntonioVega - 21 Apr 2003
This fix probably results in an incorrect mix of HTTP/1.0 and HTTP/1.1 header formats. Compare the latest version of
sub getUrl
in
CVS:lib/TWiki/Net.pm with an earlier version.
The reason for the switch to HTTP/1.1 is to allow an include of pages on virtual hosts and to support user authentication.
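Concretely, the two features that motivate HTTP/1.1 here are the Host header (virtual hosts) and the Authorization header (user authentication). A minimal sketch of such a request; the host, path, and credentials below are made-up examples, not values from TWiki:

```perl
use strict;
use warnings;
use MIME::Base64 qw(encode_base64);

# Illustrative values only
my ($host, $path) = ('twiki.example.com', '/twiki/bin/view/Main/WebHome');
my ($user, $pass) = ('guest', 'secret');

my $auth = encode_base64("$user:$pass", '');   # '' suppresses the trailing newline
my $req  = "GET $path HTTP/1.1\r\n"
         . "Host: $host\r\n"                   # selects the right virtual host
         . "Authorization: Basic $auth\r\n"    # HTTP Basic authentication
         . "Connection: close\r\n"             # avoid the keep-alive hang
         . "\r\n";
print $req;
```

Both headers are also accepted by most HTTP/1.0 servers, but only HTTP/1.1 makes Host mandatory and well-defined.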
--
PeterThoeny - 22 Apr 2003