
You can't put a : in a line that starts with a ;. And in new markup, you can't put another : in the second line. Reini's links were all eaten when this page changed to new markup. -- CraigBox


Can someone tell me how to make a wiki page for /dev/random? I thought the page would be http://www.wlug.org.nz/%2Fdev%2Frandom, but the wiki doesn't like that. -- StuartYeates

Just like that -- ReiniUrban

Turns out that there is a bug in our installed version of PhpWiki that strips a leading "/". If/when we upgrade to a newer version, this should be fixed. -- JohnMcPherson


A mode suitable for including GPG keys and other ASCII-armored texts would be nice. -- StuartYeates

New markup and <verbatim> will do that; however, remember that a Wiki is by nature editable and there is no concept of page ownership. -- CraigBox
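
For example, an ASCII-armored key could be dropped into a page like this (a sketch only; the key body is of course a placeholder):

    <verbatim>
    -----BEGIN PGP PUBLIC KEY BLOCK-----
    ...
    -----END PGP PUBLIC KEY BLOCK-----
    </verbatim>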

Latest PhpWiki releases support all that. -- ReiniUrban

Using referrer.log, do a nightly/hourly import of all sane-looking backlinks (i.e. not Google or wlug.org.nz) from the web into the backlinks page. This will give an interesting insight into who is linking to us and from where, and will be an automagic way to add 'content' to pages, in a backwards kind of way.

Be sure to stop Google crawling these links, as it might drop our PageRank.

I suggest dating each backlink as it goes into the database, and purging links that haven't been updated for a nominal amount of time (a month or three). Another way to do this is to use a token bucket: add N days until expiry (10-30?) for each visit from that link, then order by descending token counts. This will rank the more popular pages at the top, and make a slashdotting stand out, for instance.

If there is a way to make Apache do this in real time, that would also be great; then just have a nightly SQL statement removing a token from each bucket, then removing empty links. --JamesSpooner
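
A minimal sketch of what that nightly decay job could look like, assuming a hypothetical backlinks table with url and tokens columns (the table layout, connection details and top-up rule here are all illustrative, not anything the wiki actually has):

    <?php
    // Hypothetical nightly cron job: decay the token buckets.
    // Assumed table layout: backlinks(url TEXT, tokens INTEGER).
    $db = new PDO('mysql:host=localhost;dbname=wiki', 'wikiuser', 'secret');

    // Remove one token from every bucket...
    $db->exec('UPDATE backlinks SET tokens = tokens - 1');
    // ...and drop links whose bucket has run dry.
    $db->exec('DELETE FROM backlinks WHERE tokens <= 0');

    // The referer logger would top buckets up on each visit, e.g. adding
    // N days' worth of tokens (the 10-30 suggested above):
    //   UPDATE backlinks SET tokens = tokens + 10 WHERE url = ?
    ?>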

The wiki is already a target of referer spam: people put a 1x1 img or iframe that links to the wiki so that they show up heaps in our referers. No idea why they're doing it, but a lot of our referers are from porn sites, so I'm not keen on encouraging this behaviour. -- PerryLorier

Since 1.3.10 we allow size=WxH attributes for images. Should we disallow very small sizes, say below 8x8? -- ReiniUrban

This simply means they're performing a denial of service, rather than some form of advertising. To say that it's not an option before exploring the possibilities is a little premature: the page with the links can be flagged no-crawl, so that we don't 'link' to porn sites, and can be linked off the main backlinks page, so that only interested parties will view the links, with suitable disclaimers etc. This will add considerable value to the wiki. -- JamesSpooner
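
For reference, "flagging no-crawl" could be as simple as emitting a standard robots meta tag on that one page (a sketch; where this hooks into the page template is left open):

    <?php
    // Sketch: mark the imported-backlinks page as off-limits to crawlers,
    // so the external links are neither indexed nor followed.
    echo '<meta name="robots" content="noindex,nofollow">';
    ?>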


Quoting CraigBox on refactoring link lists to Category pages:

I agree, with the same reservations (which I don't care about as much any more, and won't care about at all if the OrphanedPages script could ignore pages with no backlinks when they have a CategorySomething footer... ;)

I would suggest a two-fold condition for ignoring apparent orphans: they should contain a CategoryFoo link and the CategoryFoo page should exist.

--AristotlePagaltzis
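
A sketch of that two-fold check (the function and both parameters are hypothetical stand-ins, not real PhpWiki API):

    <?php
    // Should an apparently orphaned page be ignored? It must (a) link to
    // a CategoryFoo page and (b) that CategoryFoo page must exist.
    // $links: page names this page links to; $page_exists: a callback
    // answering "does this wiki page exist?".
    function ignore_orphan($links, $page_exists) {
        foreach ($links as $name) {
            if (preg_match('/^Category[A-Z]/', $name)
                and call_user_func($page_exists, $name)) {
                return true;
            }
        }
        return false;
    }
    ?>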


Links from people's personal wiki pages to all the pages they're immediately responsible for, or some number of their recent edits.
-- StuartYeates

If you sign your pages, you have a link on your home page to that page. Some of us (e.g. me) have a huge number of backlinks on our pages. I might go and clean mine out. Although this is a curious idea. Would it only show recent changes? The way the wiki stores its DB is pretty broken, so I imagine that this is harder than you might think.
-- PerryLorier

What I meant (as opposed to what I said) is that it would be a neat idea to get some sense of what kind of wikis someone had worked on just by looking at their homepage. It would be cool to see that you're a php/debian kind of person and I'm a java/compiler kind of person, and that JoeBloggs used to be a windows weenie but hasn't been around for 6 months. I have no idea how hard this would be.
-- StuartYeates

It's not quite the same as what you've asked for, but I got the Greenstone wiki collection to record the last author for each page, and the pages get grouped by author. So you can see which pages someone was the most recent editor for (at time of indexing). -- JohnMcPherson


Some way of automatically sucking in frequently updated HOWTOs and FAQs, to keep them correct.
-- StuartYeates

How would we suck them in? How would we merge these changes with local changes?
-- PerryLorier
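
One possibility for the "sucking in" half, leaving the merge question open: fetch the upstream document on a schedule and only touch the wiki copy when upstream actually changed. The URL, cache path and merge note below are purely illustrative:

    <?php
    // Sketch: nightly import of an upstream HOWTO.
    $url = 'http://www.tldp.org/HOWTO/Some-HOWTO.txt';      // illustrative
    $cache_file = '/var/lib/wiki/upstream/Some-HOWTO.txt';  // illustrative

    $upstream = file_get_contents($url);
    $cached = file_exists($cache_file) ? file_get_contents($cache_file) : '';

    if ($upstream !== false and $upstream !== $cached) {
        // Upstream changed since the last import. Merging with local wiki
        // edits is the hard part; one option is to keep this last imported
        // copy around and run a three-way merge (diff3) against it.
        file_put_contents($cache_file, $upstream);
    }
    ?>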


Maybe you want to use real fortune output in PHP, instead of the simple taglines: http://www.aasted.org/quote/

I use it like this:

define('FORTUNE_DIR', "/usr/share/fortune");

if (empty($data['%content'])) {
    include_once('lib/InlineParser.php');
    // A feature similar to the taglines at http://www.wlug.org.nz/
    // http://www.aasted.org/quote/
    if (defined('FORTUNE_DIR') and is_dir(FORTUNE_DIR)) {
        include_once("lib/fortune.php");
        $fortune = new Fortune();
        // Pick a random quote and strip the <br> tags fortune.php appends.
        $quote = str_replace("\n<br>", "\n",
                             $fortune->quoteFromDir(FORTUNE_DIR));
        // Seed the empty page with the quote plus the usual prompt.
        return sprintf("<verbatim>\n%s</verbatim>\n\n" . _("Describe %s here."),
                       $quote, "[" . WikiEscape($this->_pagename) . "]");
    }
    // Replace empty content with default value.
    return sprintf(_("Describe %s here."),
                   "[" . WikiEscape($this->_pagename) . "]");
}

--ReiniUrban


CategoryMeta