It just occurred to me that the proposed MultiLevelWikiWebs support allows the concept of a Trash sub-web per web. This would eliminate many of the issues with the existing Trash web by inheriting protections from the parent web, and allowing much better namespace separation.

-- CrawfordCurrie - 30 Jun 2005

Yes, we could eliminate the "Select Web" dropdown from the current more page. The interface would be more distinct from the rename topic page (= less confusion). The action button would be "Move to trash".

-- ArthurClemens - 30 Jun 2005

So, Crawford, you are perhaps seeing that subwebs/hierarchical webs/etc are a desired and useful feature? I think I asked this a few years ago and was thoroughly chastised for even thinking it smile "Move toward the light" hehe! (friendly)

-- SteveRJones - 30 Jun 2005

No, I don't see them as a desirable feature. However, a web-specific Trash receptacle I do see as desirable.

I will never join the dark side!

-- CrawfordCurrie - 30 Jun 2005

A good extra here would be to just add the button to the interface (or the "More topic actions" page) and create the Trash subweb during the first invocation if it doesn't already exist in the local installation.
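
A minimal sketch of that lazy-creation idea, assuming the usual one-directory-per-web layout under the TWiki data root (the paths, example web name, and WebPreferences seeding here are illustrative stand-ins, not actual TWiki code):

```shell
# Hypothetical sketch, not actual TWiki code: create the per-web Trash
# subweb on the first "Move to trash" invocation if it is missing.
DATA_DIR=$(mktemp -d)        # stand-in for the TWiki data root (assumption)
WEB="Engineering"            # example web name (assumption)
TRASH="$DATA_DIR/$WEB/Trash"

ensure_trash_subweb() {
  if [ ! -d "$TRASH" ]; then
    mkdir -p "$TRASH"
    # Seed a minimal WebPreferences topic; in a real installation the
    # access controls would be inherited from the parent web.
    printf '%s\n' '   * Set WEBBGCOLOR = #DDDDDD' > "$TRASH/WebPreferences.txt"
  fi
}

ensure_trash_subweb          # first invocation creates the subweb
ensure_trash_subweb          # later invocations are a no-op
```

Because creation is idempotent, the button can simply call this on every use and never needs a separate setup step.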

-- PeterNixon - 30 Jun 2005

Crawford, you do not understand the power of multilevel - after all, Motorola is your father!!

Aack, sorry... yes, a trash "bin" per web would be a very nice improvement - and wouldn't really break anything either. In fact, it is perhaps more intuitive to a casual TWiki user.

-- SteveRJones - 01 Jul 2005

And, you could dump an entire stale web (subwebs and all) into it. I've been using the notion of an Archive web in MegaTWiki for these purposes, but I suppose Trash would work also.

-- PeterNixon - 01 Jul 2005

Crawford, in one sentence or less, could you at least enlighten us as to why multilevel webs are not a desirable feature? I'm not trying to argue - I just would like to understand the objection. I can imagine a few reasons but would rather get your perspective.

-- SteveRJones - 01 Jul 2005

IMHO multilevel webs are too similar to the old directory structure of file systems. We can do better, can't we? smile

-- FranzJosefSilli - 01 Jul 2005

Franz - you are right - even single-level webs are limiting, as they are file-level directories. But until we have a non-filesystem Store to play with, it's quite likely that all our inspirations will come from there.

-- SvenDowideit - 01 Jul 2005

Steve, see WhyWebsAreABadIdea.

-- CrawfordCurrie - 01 Jul 2005

Ah, so it might be stated that strongly tying the organization of TWiki webs to the underlying storage "motif" is inherently limiting - which is one reason why I liked Zope and its underlying DB storage. Note that pure (as in 'snobbish') Unix sysadmins HATE non-filesystem-oriented storage - at least, this was the case with the ones I have dealt with in the past. I attribute this to the fact that the filesystem is something they can count on as being very concrete, as opposed to a database, where relationships are what matter.

-- SteveRJones - 01 Jul 2005

Actually, when admining, the biggest worry I've had with databases is that I lack confidence in their ability to do a real fsck and to recover - while I expect my (Unix, at least) file system to work.

And in Unix, I can do everything on the filesystem - use find, ls, du, grep - rather than having to figure out what this database has decided to call their SQL client, then work out what horrid authentication system they use, and then battle with their unique version of SQL...

smile god I'm glad the GNU tool set has become mainstream smile
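
To sketch the point above: with a filesystem store, ordinary GNU tools are the whole "query language". The data layout below is a toy stand-in for a TWiki web (topic names and settings are invented for illustration), not a real installation:

```shell
# Toy stand-in for a filesystem-backed wiki store (all names illustrative).
STORE=$(mktemp -d)
mkdir -p "$STORE/Sandbox"
printf 'Set ALLOWTOPICVIEW = AdminGroup\n' > "$STORE/Sandbox/SecretTopic.txt"
printf 'Just a note.\n' > "$STORE/Sandbox/ScratchPad.txt"

# Which topics carry an access-control setting? grep instead of SQL:
grep -rl 'ALLOWTOPICVIEW' "$STORE"

# How big is the web? du instead of a reporting query:
du -sh "$STORE/Sandbox"

# Which topics changed in the last day? find instead of a query planner:
find "$STORE" -name '*.txt' -mtime -1
```

The trade-off, as noted above, is that the store's integrity guarantees are only as good as the filesystem's own.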

-- SvenDowideit - 01 Jul 2005

What could handle caching for performance? Is this anywhere near database speeds?

-- ArthurClemens - 01 Jul 2005

Which is why one rarely finds a sysadmin and a DBA all rolled into one. The first day our Oracle DBA came to the sysadmins and asked for a partition, they created it as a ufs-formatted one. The horror when the DBA came back and said it had to be "raw"! Aack, an application managing disk space? Inconceivable!

DBs are well tuned for caching, as the data they are delivering is well defined and known to the system. This cannot be said for files in a filesystem. The other big limiter is the application serving the data to the client: it may be able to pull the data off the disk quickly, but getting it to the client is another story.

-- SteveRJones - 01 Jul 2005

One of the biggest reasons to have a TrashWebPerWeb is so that the trashed topics can retain the same security permissions as the original web. (Well, there are other ways of dealing with this too, but this approach seems to be the easiest way, by far.) "Deleting" topics has led to "leaks" in more than one TWiki installation...

-- WillNorris - 02 Jul 2005

Topic revision: r17 - 2005-07-02 - WillNorris
Copyright © 1999-2018 by the contributing authors. All material on this collaboration platform is the property of the contributing authors.