Question
I am facing two problems.

The first problem is that an unusually large number of session files are generated in the /tmp directory each day, although the number of visitors to our site is only 3 or 4 a day.
What could be the reason for such a large number of session files in /tmp?
Following is a sample of the files generated in /usr/tmp:
-rw-r--r-- 1 apache apache 156 Jul 14 18:30 cgisess_ff657efa97a972843243bc09f6cc41c8
-rw-r--r-- 1 apache apache 156 Jul 14 20:33 cgisess_ff84add7b8a93819332a953e22768bce
The second problem is that the error_log file shows this error all the time:
[Thu Jul 13 02:01:46 2006] [error] [client 206.245.57.200]
AccessControlException: Access to edit
TWikiTipsOfTheDay for
TWikiGuest is denied. authentication required
No one accesses TWikiGuest, and the error_log file grows very large within a day.
Does anybody have any idea?
Environment
--
RaviGupta1 - 13 Jul 2006
Answer
On session files: you probably have a negative number in {Sessions}{ExpireAfter}. See the comment shown when you run configure: "By default, session expiry is done "on the fly" by the same processes used to serve TWiki requests. As such it imposes a load on the server. When there are very large numbers of session files, this load can become significant. For best performance, you can set {Sessions}{ExpireAfter} to a negative number, which will mean that TWiki won't try to clean up expired sessions using CGI processes. Instead you should use a cron job to clean up expired sessions. The standard maintenance cron script tools/tick_twiki.pl includes this function."
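If you do set {Sessions}{ExpireAfter} to a negative number, a crontab entry along these lines would run the standard maintenance script; the /usr/local/twiki install path is an assumption and will differ on your system:

```
# Run the standard TWiki maintenance script once a day at 02:00.
# /usr/local/twiki is an assumed install path - adjust to your setup.
0 2 * * * cd /usr/local/twiki/bin && perl ../tools/tick_twiki.pl
```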
On the error message: TWikiGuest is the user name for unauthenticated users. I am not sure, however, why you get this message.
--
PeterThoeny - 15 Jul 2006
If it is a public site (on the internet), chances are the session files are created when spiders visit your site. Peter's advice is good for solving this. The second error message is probably because you have a link somewhere to an edit of the tips-of-the-day topic, but you haven't configured your robots.txt to exclude the edit script (search Google for help on robots.txt).
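A minimal robots.txt for this, placed at the web root, could look like the following sketch; the /twiki/bin URL path is an assumption, so substitute whatever path your CGI scripts are actually served from:

```
# Keep well-behaved spiders away from the scripts that create
# sessions and trigger authentication errors (assumed paths).
User-agent: *
Disallow: /twiki/bin/edit
Disallow: /twiki/bin/attach
Disallow: /twiki/bin/rename
```

Note that robots.txt only stops well-behaved crawlers; misbehaving bots will still hit the scripts and create session files.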
--
CrawfordCurrie - 12 Aug 2006