"Houston, we have got a problem"

Dear CodevCommunity,

Let me summarize the current situation and the recommended actions to take. We count on you: jump on the train to the TWikiMission, don't miss it :-)

Past and Present

  • Past:
    • A well-functioning CoreTeam
    • Contributions primarily by the core team
    • Relatively quick additions of features
    • Yearly or half-yearly production releases

  • Currently:
    • A well-functioning core team, but with some members inactive due to other constraints/priorities
    • Contributions primarily by the CodevCommunity
    • Relatively slow acceptance of features into core code
    • Yearly production releases

Characteristics of the CodevCommunity

  • Things that work:
    • We enjoy a very high signal-to-noise ratio at TWiki.org compared to other OpenSource communities. This is attributed to our strategy of attracting new members through a natural screening process, which invites motivated folks; motivation is the driving force behind OpenSource projects.
    • Good documentation for new folks to get up-to-speed on how we work together, see ReadmeFirst and PatchGuidelines.
    • Evolving process for tracking bugs and features.
    • Overall, a friendly and courteous environment, inviting folks to come back (e.g. the "high stickiness" of the Codev web).

  • Things that don't work:
    • Too few active CoreTeam members; the core team is not allocating enough resources to get contributions accepted into TWiki.
    • The process for managing the release in progress (currently CairoRelease).
    • We have exceptions to the friendly and courteous environment, a force working against the community. In particular, PropagandaWithFearAppeal has played a role in painting a darker picture of the situation than necessary.
    • Slow advancement of core functionality. Overall, however, TWiki advances much faster if you take the Plugins into account. The graph on the right illustrates this. [Graph: functionality timeline up to now]

Many members of the CodevCommunity bring a lot of experience and, with it, sound contributions. We also have domain experts on board; to name just one, ArthurClemens is a usability expert who has contributed very good UsabilityIdeas in various places.

Possible Solutions

  1. Fork: A fork sounds like an obvious choice: fork, take as many developers along as possible, define new processes, and start churning out features at a rapid pace (at the cost of quality/stability?).
  2. Architectural changes: Change the architecture of TWiki to facilitate the rapid evolution of TWiki, aligned with the TWikiMission, without compromising stability and quality.
  3. Process changes: Enhance the process in the Codev web to facilitate the rapid evolution of TWiki, aligned with the TWikiMission.

1. Fork: A fork is certainly a possibility, but there is a high price to pay: TWiki losing ground to other wikis because of split resources (people) and code-merging efforts. Multiply that by x for x number of forks. Whatever happened to WiiskiWiki, a TWiki fork? Now dead. What happened to the many *BSD flavors? They are losing ground to Linux due to fragmented communities and development effort. What happened to PHPNuke and PostNuke? The list can go on and on. This is clearly not in the interest of the CoreTeam and hopefully not of the majority of the CodevCommunity members.

2. Architectural changes: Plugins evolve faster than the core code, for some good reasons, as mentioned in PleaseEmpowerYourContributors. This is not necessarily a bad thing. It is much easier to scale Plugins development than the core, the same way it is easier to develop a standalone application on top of an OS than it is to enhance the kernel at high quality. So, it makes sense to make architectural changes with these high-level goals:

  • Modularize the core code
  • Move some of the core into Plugins
  • Facilitate further the development of Plugins and Skins
The current code base cannot serve as a model for a nice architecture, but we can get a lot of mileage out of it by evolving the code base incrementally. With those changes it will be easier to embrace new technologies like WebServices.
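
To make the "move some of the core into Plugins" goal more tangible, here is a minimal sketch of what such a plugin could look like, following the handler conventions documented in TWikiPlugins. The plugin name ExamplePlugin and the %EXAMPLETAG% variable are hypothetical, purely for illustration:

<verbatim>
# Hypothetical plugin that takes over one small piece of rendering
# from the core, using the standard TWiki plugin handlers.
package TWiki::Plugins::ExamplePlugin;

use strict;
use vars qw( $VERSION );

$VERSION = '1.000';

# Called when the plugin is loaded; return 1 on success.
sub initPlugin {
    my ( $topic, $web, $user, $installWeb ) = @_;
    return 1;
}

# Called with the topic text before rendering; $_[0] is modified in place.
sub commonTagsHandler {
    # my ( $text, $topic, $web ) = @_;
    $_[0] =~ s/%EXAMPLETAG%/renderExampleTag()/ge;
}

# The feature logic lives outside the core, so it can evolve at its
# own pace without destabilizing the kernel of TWiki.
sub renderExampleTag {
    return 'example output';
}

1;
</verbatim>

A handler like this is the same mechanism a carved-out core feature would use: the core calls the handler, the feature logic lives in the plugin, and the two can be released on independent schedules.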

3. Process changes: This is clearly the area where we can get the most mileage toward a faster advancement of the TWiki platform. Simply bumping up the number of core team members does not work. Accepting patches is currently very time-consuming because we have to test each patch manually to make sure that it does not break anything. In particular, we will:

  • Define clear roles and responsibilities for the core team members
  • Get new Core team members on board with clear roles
  • Restructure the process to facilitate the release in progress (CairoRelease)
  • Define and implement a test framework that allows a rapid inclusion of new features

Hopping on the Train (Recommended Actions)

The CoreTeam hopes to get your support. I will do my best and will do whatever is necessary to align our community. Pulling in the same direction strengthens the CodevCommunity; working against it weakens the community. It is as simple as that.

Actions for the CodevCommunity:

  1. Produce patches that can be integrated easily (tested, high quality, aligned)
  2. Work on the features identified in CairoRelease
  3. Review ReadmeFirst and the PatchGuidelines and provide constructive feedback

These are short term actions. Follow up actions will be communicated once the process changes have been fleshed out.

Actions for the CoreTeam:

  1. Define clear roles and responsibilities (we have been working on this behind the scenes since last month; expect a first cut within a week)
  2. Get new Core team members on board with clear roles
  3. Officially roll out CoreTeamHallOfFame

With these steps in place we can start the actual work, with the help of a larger core team:

[Graph: functionality timeline going forward]

  • Restructure the WebForm to facilitate the release in progress (CairoRelease)
  • Define and implement a test framework that allows a rapid inclusion of new features
  • Include many good patches waiting to be accepted
  • ... many more to come

With this in place we will be able to accelerate the advancement of the core code again. The graph on the right illustrates this.

Wikis are still in a niche market. It is only a matter of time until wikis (and TWiki) take off; see yesterday's CNN article Wikipedia: The know-it-all Web site.

Thanks for listening, stay tuned for interesting times ahead,
PeterThoeny and the CoreTeam.

-- PeterThoeny - 05 Aug 2003



Feedback


...I'm looking forward to the "first cut" mentioned above

Possible solutions: there is no mention of branching, which is an intermediate step between what we have now and a full-fledged fork. E.g. there could be stable and unstable branches, as is common with open source projects. (Is it possible to have write access to one CVS branch and not another?)

Recommended actions, Codev community: the outline looks good, but how are the steps to be accomplished, e.g. what do I do? What does "patches must be aligned" mean?

Recommended actions, core team: code contributions need feedback from the core team within X weeks (suggest 4): "Your patch for Foo-X cannot be included as-is because: [doesn't fit mission goals | breaks Foo-Y | fails test "Blooey" | not enough docs | doesn't follow coding conventions | haven't had enough time to check it out | significant performance penalty | etc.]"

-- MattWilkie - 05 Aug 2003

I'd like to expand on "Define and implement a test framework that allows a rapid inclusion of new features".

Lack of a test framework is a major problem for plugin authors as well; we have no way of finding out if we've broken the core functions.

When we come to specify the processes for contributions to the core, I really, really, really want to see a formal strategy for unit and integration testing that addresses this. There have to be tests that give us the confidence to say "well, I changed it, but the old stuff still works" really quickly. That way a patcher can also answer the 4-week-wait question described above for themselves, and the core team can focus on architectural goals. Also, bug-finders can submit test cases that demonstrate the problem, thus accelerating debugging.

FWIW, in our shop we use JUnit, PerlUnit and HttpUnit extensively, so I favour the use of PerlUnit (Test::Unit).
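
To illustrate, here is a minimal sketch of what such a test case could look like with Test::Unit; the renderBold routine is a hypothetical stand-in for whatever core function is under test:

<verbatim>
# Minimal PerlUnit (Test::Unit) sketch. renderBold is a hypothetical
# stand-in for a core TWiki rendering routine under test.
package RenderTest;
use base qw( Test::Unit::TestCase );

sub renderBold {
    my ( $text ) = @_;
    $text =~ s{\*([^*]+)\*}{<strong>$1</strong>}g;
    return $text;
}

# Every method named test_* is picked up and run by the test runner.
sub test_bold_markup {
    my $self = shift;
    $self->assert_equals( '<strong>hi</strong>', renderBold( '*hi*' ) );
}

sub test_plain_text_untouched {
    my $self = shift;
    $self->assert_equals( 'no markup', renderBold( 'no markup' ) );
}

1;

# To run (assuming the package is saved as RenderTest.pm on @INC):
#   perl -MTest::Unit::TestRunner \
#        -e 'Test::Unit::TestRunner->new()->start( "RenderTest" )'
</verbatim>

A patcher who can run a suite like this in seconds no longer has to wait weeks for the core team to answer "did I break anything?" for them.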

-- CrawfordCurrie - 06 Aug 2003

I am not so sure that a fork would be "bad". A fork would be beneficial if we fork to start the design for the Next Generation TWiki (see TWikiNG or TWikiOO).

I feel that the modularization of the TWiki API is now the most important issue for the health of TWiki and to keep it competitive. A clear modularization also allows for sub-project delegation and for a wider expansion of the Core team.

My suggestion is: Officially start the TWikiNG fork (and stop doing evolutionary releases on the old code, except for support)

By officially starting a fork for TWikiOO, some issues will be solved:

  • we start working on the object-oriented version (some parts of TWiki are already OO)
    • this way we finally obtain a mod_perl-usable TWiki (and we obtain TWiki farms and so on)
    • contributors will love to contribute to it
  • all the contributors could focus on the big new step
  • the fork is "official" ... that is, it's the "new version of TWiki"

-- AndreaSterbini - 06 Aug 2003

It is process change that will make the difference between needing a CodeFork and needing a ProjectFork. This is the crux of the issue that will convince me that it is better for me to continue to participate in TWiki-as-a-project rather than, say, taking the code and using it to write a TWiki-compatibility layer on top of, say, CgiWiki.

It is crucial that we define a TWikiRoadMap that states what we are trying to achieve in each release, and that we use this to bound the scope and set the date of each release. Further to the plans I outlined in TWikiVision back in Sept 2002, I hope this time the CoreTeam and the CodevCommunity will contribute, work through, and eventually agree on what our priorities should be.

I look forward to a wider community being entrusted with the rights to make process improvements to WebPreferences in the new webs here at TWiki.org; I also encourage the community to continue to challenge the processes we have in place - for example, does the submit-a-patch system that we have actually work? I venture it doesn't and won't. Once we admit our weaknesses we can seek improvement; in this case, simplifying the processing of uploaded patches might lead us to the adoption of a tool such as BitKeeper or Aegis.

Process-wise, the changes we need to evolve TWiki are revolutionary. They will require a lot of directed and consistent effort to create lasting change and enduring competitiveness. We risk falling back into the same patterns, and I, for one, need reassurance - in the form of specific, visible actions from the whole community - that we won't fall back, before I am willing to exert the required energy.

I eagerly await details of the plans that Peter and the Cairo Core Team have been working on over the past few weeks.

-- MartinCleaver - 09 Aug 2003

I also disagree that a fork is a bad thing. Why? Lots of people run forks of TWiki code already - the question is, are they FriendlyForks or UnfriendlyForks? For example, the Alan Cox and Robert Love forks of the Linux code trees haven't resulted in death & destruction (if we're talking about PropagandaWithFearAppeal ;-) ) and indeed have been highly beneficial - new code has been trialled extensively in these trusted code trees, resulting in stable code that Linus trusts for inclusion.

The key point about these FriendlyForks is that they're willing to risk a modicum of instability (or a chaotic amount of instability, in the case of a .odd release) in favour of actually resolving interaction issues.

I've been considering opening up my TWiki code tree for some time, largely because I've noticed that there are features I use that others have been crying out for, and that I think will only make it into the core (or similar) if there's an active demonstration that they are useful and beneficial. I don't think it would please everyone, and it certainly wouldn't be the rewrite that David and Andrea mention, but it might be a step? A fork is only bad if there is no way of merging forks and forked code trees as you move down the road.

Incidentally, I echo Crawford's comments about unit testing - test-driven programming is quite frankly, IMO, the best idea since sliced bread. Anyone who is a programmer really should read the articles by Robert Martin in sdmagazine to get a good idea of just how great a method it is (representative article). TWiki would need a full set of tests retrofitting in addition to the existing tests, however, which is a large undertaking.

-- MichaelSparks - 13 Aug 2003

I think rearchitecting the existing TWiki codebase is a great idea. I personally believe that complete rewrites are almost never worth it unless your aim is to create a completely different product for all intents and purposes. I hope and believe that this new initiative will really jumpstart TWiki's development and growth.

-- WalterMundt - 15 Aug 2003

As promised, we are now rolling out the first three steps:

  1. Define clear roles and responsibilities, see CoreTeamRoles
  2. Get new Core team members on board with clear roles, see DevelopersNews
  3. Officially roll out CoreTeamHallOfFame

Crawford's comments are right on target. A new TWikiTestInfrastructure will help in getting patches accepted more quickly.

I added some more comments on the "what is not working" section with regard to TWiki functionality.

Oh, my wireless network card stopped working for the second time within 30 minutes. Too hot in San Jose; I had to hold the PC card against the side of the fridge to cool it down, and now it works again :-) Now I have a big fan next to my notebook computer; let's see if I can work uninterrupted...

-- PeterThoeny - 18 Aug 2003

A further point on forks - many people run forks already. The core TWiki.org codebase has been pretty incompatible with my tree for around 3 years now, and the majority of features I've submitted back have normally been merged in a fashion incompatible with my data/code tree (normally resulting in me having to go through all the data in my webs to fix the problem).

The overheads in coding for TWiki are generally pretty high (it generally means running a private fork by definition, since the majority of people do not have write access to CVS, and partly due to the very large number of globals). Given this, there is only a slight difference in overhead compared to running a public FriendlyFork, since the majority of discussions would continue to run on TWiki.org.

Also, I'm not sure it's certain that a fork is detrimental to TWiki.org - I've been running a (private) fork for 3 years now. Aside from when I gave up trying to contribute due to busted process 2 years ago (encouraged back by a fervent TWiki evangelist...), I've tried to contribute code, effort, and ideas, and to encourage other developers. That's not the sort of thing that happens with the DragonFly vs FreeBSD vs NetBSD vs OpenBSD forks, I know, but it is more like the sort of thing that happens with the Linux tree forks. (One is fragmentary, the other is complementary.)

-- MichaelSparks - 18 Aug 2003

Even a FriendlyFork wastes effort...

My comment above is meant as "a fork within TWiki", i.e. the planning and development of the next release of TWiki, exactly as the Linux community does with its "development track" and "stable track" (odd and even minor version numbers).

-- AndreaSterbini - 29 Aug 2003

The Alan Cox tree (now completely merged with the mainline) was extremely beneficial to the core tree and is an example of a FriendlyFork at its best. Having an internal fork - an unstable/development tree - inside TWiki is clearly a good thing. Alan's tree is, however, not the only one, and the multitude of them have not detracted from the core Linux tree in the slightest. Example FriendlyForks in Linux: SuSE's tree, Redhat's tree, Alan's tree, Robert Love's tree, etc. I would argue that SuSE's and Redhat's trees are the most widely deployed and often test code long before it makes it into the core - for example, ALSA.

It hardly needs arguing that the man-years (man-decades? man-centuries?) that ALSA & KDE have gained as a result of SuSE's efforts and distribution of their work are part of the reason why these things work so well. E.g. ALSA has been used by SuSE users for a year or two now, and is only finally going into the core tree. Has this detracted from the core Linux? No! Has it benefited the core Linux tree? YES!

Examples of forks going bad, in terms of not working, include many of the BSDs, the original Unix code, and the ugliness caused by "people" like SCO (spit!). These examples don't work because the teams stop making it possible to work together and merge code. (And in the last case by apparently claiming ownership of everything anyone else has done.)

If non-internal forks aren't beneficial, why has Peter been accepting code, features and bug swats from my fork for months now? He hasn't taken the installer - which he doesn't even have to GPL, since it's BSD'd to him/you/the CoreTeam and contains no TWiki code - but enough people have taken it to make it worth releasing. (An installer that, you'll note, takes pains to stay compatible with the core TWiki tree despite resulting in a rather different install from normal.)

-- MichaelSparks - 29 Aug 2003

I think forks are best viewed on a continuum, based on their size (how much has changed), compatibility (can they both take the same patches, use the same data, etc.), and installed base (how many installations of the forked code), as well as friendliness (do they cooperate with the main project). The fork terminology below is my own, by the way.

Any TWiki installation that has customised the code, or even the bundled webs, can be viewed as a 'mini fork' that remains small, compatible, and friendly, with a low installed base (usually within one company only). My company's installation is like this, and I'm sure the same is true for many others.

Those with more significant code changes can be considered 'midi forks' that are similar in compatibility, installed base, etc. They are also called TWikiClones.

Those that package up their changes and distribute them to others are 'real forks' in some sense, since a potential TWiki user could use the core project code or the forked code. This can waste effort, but if they remain friendly they can be beneficial to the core code, by enabling larger changes to be well tested by a number of installations using the forked tree, before inclusion in the core. The various Linux trees from Alan Cox and others are 'friendly real forks' in this terminology. It may or may not be an accident that friendly real forks are associated with 'bazaar style' developments such as Linux (see Google:cathedral+bazaar for details).

The real problem is 'unfriendly real forks' - they are large enough to attract a significant developer base, yet the cooperation is not there for whatever reason. NetBSD and OpenBSD started as somewhat unfriendly forks, though in practice some coders are on both projects' core teams.

The various sorts of forks can be beneficial as long as they remain cooperative with the core project, but the more ambitious they get, the less compatible they are likely to be even if the will to cooperate is there. It's generally better for everyone if the forks are relatively close to the core project (as with the Linux trees and distros), experimenting with new concepts and features and then folding code back into the core. Keeping such forks as branches within the core project is the ultimate in friendliness, compatibility, etc, but they are probably best called something else.

Merging patches from an incompatible fork into the core project is always problematic - it's the right of the fork owner to remain incompatible, but it's also the right of the core project people to merge the patches in a way that doesn't reduce the incompatibility with the fork (perhaps because it would create upgrade issues for core users.)

-- RichardDonkin - 29 Aug 2003

Good points. On the point of incompatibility - by definition any two forks will be incompatible. However, it's how that incompatibility is managed that is important. A FriendlyFork will (should) try to a) retain compatibility with all the functionality in the core production base, whilst b) adding functionality which isn't in the core production base, for user testing, and if things prove useful/stable, c) put them forward for inclusion in the core production base - if those running the core production base think it's worthwhile.

I.e. a FriendlyFork doing its job right will have lots of experimental features being played and toyed with, only some of which will be accepted into the core production base. A FriendlyFork done wrong would a) have very few different useful features, or b) not have any/many features being accepted back - because they're not being submitted back for pondering, because only bad features are being submitted, or because "poorly coded or inappropriate" features are submitted back. Furthermore, if a FriendlyFork is doing its job right, the features will have been tested more than the average patch (including mine) that gets attached to TWiki.org.

That's why the key difference I laid out between friendly and unfriendly on the two pages was compatibility level and the effort taken to maintain compatibility. Peter and the rest of the CoreTeam do good stuff, which is why I want to open my fork - so that I help offload their work (after all, it gives a clear way of testing patches often and early). After all, things like data and code separation and the TWiki installer aren't likely to get included until they've proven their worth to several people (which is sensible), but they can't do that unless people can easily use them.

The same goes for overlapping named include sections - Peter and a few others can't see the worth (which, again, I have no problem with), whereas I'm using them to have end-user-themeable skins - which is extremely nice. (E.g. one user can have a theme that looks like c2.com, another has the classic skin look, and another has a completely different one - and each user only defines their look on one page.) Lots of people have expressed doubts about the way precompiling of pages could/should work. Again, it's never going to get into a core distribution unless the CoreTeam see it working, and not just for one person, but for enough people to show its worth. It is sufficiently complex to require a separate fork to make it easy for people to test, however.

All the above IMO of course.

If I can't even release my private fork as a FriendlyFork (if the CoreTeam REALLY don't want help that much), I'd rather (privately) write something else from scratch than run an UnfriendlyFork.

-- MichaelSparks - 29 Aug 2003

TWiki.org wiki pages load way too slowly from this SourceForge server. This greatly hampers and discourages participation here. We should move the TWiki.org wiki to a commercial Web host if SourceForge cannot speed up our page loads to a normal and usable rate. Heck, I could serve TWiki.org faster from my PIII server over my static-IP DSL account, and would be willing to. A $20/month Dreamhost.com or HE.com hosting account would be more like it, though. Our site's speed at SourceForge is grossly dysfunctional :-(. We could continue using SourceForge for what it is good for and just host the wiki itself elsewhere, I am sure.

-- RogerChrisman - 25 Sep 2003

Yes, this is true. It really takes (too?) much time and determination to post on TWikiDotOrg...
One might argue that this sloooow preview/save cycle helps you to calm down and consider your arguments thoroughly ;-)
Recent history shows that this does not help :-(

So I urge us to find the ReasonForTwikiOrgSlowPerformance - and fix it, of course.

-- PeterKlausner, 25 Sep 03

RichardDonkin mentioned recently (in CoffeeBreak, I think) that for a limited time Dreamhost is offering a $10/mth special on their $80/mth package. I think we should take advantage of it. As a matter of fact, I pledge $20 towards it right now (see ReasonForTwikiOrgSlowPerformance).

-- MattWilkie - 25 Sep 2003
