Bug: Large attachments break TWiki
Test case
- Attach files to a page until they completely fill the filesystem that contains both the TWiki attachments (pub) and data files
- The contents of page X are completely lost
This is fairly bad - I think page data should never be lost, even if attachments fill the disk and prevent new pages from being written.
Ideally we could put a maximum file upload size on TWiki attachments - the huge uploads our site received were not in fact intended as a denial-of-service attack, but they had the same effect.
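The check could be as simple as comparing the size of the uploaded temporary file against a configured limit before it is moved into pub/. A minimal sketch in Python - the constant and function name are illustrative, not actual TWiki code:

```python
import os

# Illustrative limit, analogous to a parameter that would live in TWiki.cfg.
MAX_ATTACHMENT_BYTES = 10 * 1024 * 1024  # e.g. 10 MB

def check_attachment_size(tmp_path, limit=MAX_ATTACHMENT_BYTES):
    """Reject an uploaded file before it is moved into the pub/ area."""
    size = os.path.getsize(tmp_path)
    if size > limit:
        raise ValueError(
            "attachment is %d bytes, over the %d byte limit" % (size, limit))
```

Because the check runs before anything is written under pub/, an oversized upload is rejected without touching the filesystem that also holds the page data.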
Fix record
Environment
| TWiki version: | March 15 2001 beta |
| TWiki plugins: | Interwiki |
| Server OS: | Linux 2.2 |
| Web server: | Apache 1.3 |
| Perl version: | 5.6 |
| Client OS: | Win2000 |
| Web Browser: | IE5.0 |
-- RichardDonkin - 25 Aug 2001
I'm happy to take this on, as long as there is broad agreement as to an approach. Thoughts:
- Do as Richard suggests and have a maximum file size - this would be a parameter in TWiki.cfg
- Consider a Plugin hook so that fancier implementations are possible, e.g.:
  - more than x% of free space
  - keep track of an individual's uploads over a period
Note that the file is always uploaded to a temporary area by the web server, so some restrictions lie at that level.
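One way to enforce a cap at that level is Apache's own `LimitRequestBody` directive, which bounds the size of any request body, including file uploads, before the CGI script ever runs. The directory path and limit below are illustrative:

```apache
# Cap request bodies (and hence uploads) at ~10 MB for the TWiki scripts.
<Directory "/home/twiki/bin">
    LimitRequestBody 10485760
</Directory>
```

This rejects oversized uploads with an HTTP error rather than a friendly TWiki message, so it is a backstop rather than a replacement for an application-level check.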
-- JohnTalintyre - 03 Jan 2002
I would prefer something simple initially, unless discussion on this page proves that a more sophisticated model is required - so a simple global maximum attachment size would be fine IMO. Finding free space across Windows and Un*x is probably non-trivial, and the same applies to tracking individuals' file uploads.
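For the fancier percent-of-free-space idea, a portable free-space lookup is sketched below in Python using `shutil.disk_usage`, which works on both Windows and Un*x; the function name and default threshold are illustrative:

```python
import shutil

def has_free_space(path, needed_bytes, min_free_fraction=0.05):
    """True if writing needed_bytes would still leave at least
    min_free_fraction of the filesystem at path free."""
    usage = shutil.disk_usage(path)  # portable across Windows and Un*x
    remaining = usage.free - needed_bytes
    return remaining > usage.total * min_free_fraction
```

Refusing uploads while some headroom remains is what keeps the data directory writable, which addresses the original bug: page saves never compete with attachments for the last few blocks of the disk.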
However, what I'd really like to see is a fix for BackFromPreviewLosesText, and inclusion of RefreshEditPage in BeijingRelease.
-- RichardDonkin - 04 Jan 2002
TWiki now supports a SizeLimitForFileAttachments.
-- PeterThoeny - 26 Dec 2003