Well, that was random.

This week, I got the bright random idea to migrate the Avarthrel Encyclopædia from MoinMoin to MediaWiki.

Guess what? It happened. Guess what? It was much easier than going from MediaWiki to MoinMoin.

Why move away from MoinMoin?

To be fair, Moin has a few advantages over MediaWiki. MediaWiki is developed by the Wikimedia Foundation, and its primary development concern is getting the software to work properly for Wikimedia’s own needs. This means there isn’t much effort put into making it work for everyone else’s needs, even though they’re obviously not actively hindering that, nor rejecting outside efforts. It’s just that those needs are clearly a second priority.

I was constantly worried about compatibility back when I was running MediaWiki on PostgreSQL. Since avarthrel.org’s webhost supports MySQL, I’m trying to minimise the headache by sticking to it.

(Side note: Looks like MediaWiki supports MyISAM. Fuck that shit. InnoDB or bust.)
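
If the host’s MySQL defaults to MyISAM, it’s at least possible to check for stray MyISAM tables after installation and convert them one by one. A rough sketch, with the database name as a placeholder and the page table as just one example:

% mysql -e "SHOW TABLE STATUS WHERE Engine = 'MyISAM'" wikidb
% mysql -e "ALTER TABLE page ENGINE = InnoDB" wikidb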

Another thing about MediaWiki is that it’s pretty difficult to set up, especially if you want a farm of small wikis, or want a new one to spring up whenever it’s needed. Upgrading and maintaining a bunch of small wikis is entirely up to you, and it’s difficult as heck. If you just want a single wiki, you’re good to go.

Also, MoinMoin’s editing interface is faster. Not that MediaWiki’s is far behind.

So why move?

MoinMoin doesn’t have much in the way of templating. MediaWiki has templates, and you can just go and create more in no time at all: define Template:Foo once and transclude it anywhere with {{Foo}}. MoinMoin has all sorts of expansion mechanisms, but they all require wading through some fairly arduous Python.

MoinMoin 1.x isn’t backed by a database, so there are a few nasty limitations on what it can do easily. There’s an article list, there’s caching, there’s a search function, and there “sort of” are categories. And that’s just about it. Rumour has it there’s some kind of What Links Here functionality somewhere, but I’m not sure.

MoinMoin’s syntax is… well, not quite modern. MediaWiki’s syntax is, in my personal opinion, more elegant and more forgiving.

This was a small sticking point, but I think it needs to be addressed: the home wikis are different in nature. MediaWiki.org is well organised, clean, and informative. MoinMo.in is a mess of things that were, are, might be, or one day in the future shall be. It is a mess of conjecture, hypothesis, obsolete facts and even things that appear to make sense. And lots of talk.

But there are two big things that absolutely blow MoinMoin out of the water.

Semantic MediaWiki

I’m so happy that I’m back with Semantic MediaWiki. Short explanation: articles have subjects, which have properties of different types! And those properties are written right in the wikitext, just like any other link: for instance, [[Located in::Varmhjelm]] both links to Varmhjelm and stores a typed “located in” property. So an article can carry a mass of structured data that’s parsed straight from the text the reader sees.

Bot support

Legend has it that it’s possible to write Python tools on top of the MoinMoin libraries that do all sorts of interesting stuff with the data. Good luck finding any documentation on that.

MediaWiki, on the other hand, has an API. And various pre-built bots.
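
That means you can poke a wiki from the command line and get machine-readable page data back. Something like this, where the URL is just a placeholder for wherever the wiki’s api.php happens to live:

% curl 'http://wiki.example.org/w/api.php?action=query&prop=info&titles=Main_Page&format=json'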

Long story short, I installed Pywikipediabot and started blowing shit up.

Uh oh, all these quote templates are screwed up. Run replace.py. Get rid of the MoinMoin metadata. Run replace.py.
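
For the record, those runs look roughly like this (from memory, so take the details with a grain of salt): old text, new text, -start:! to walk through every page, and -regex is available for the hairier patterns. The MoinMoin leftover being stripped here is just one example:

% python replace.py -start:! '#format wiki' '' -summary:'Removing leftover MoinMoin processing instructions'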

Date templates should be added back, so that a plain “1.II.345 AR” gets wrapped in a proper date template. Surely that’s bloody impossible to automate? Nope: add a few custom rules as “fixes”. Run replace.py.
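
In Pywikipediabot terms, the custom rules go into user-fixes.py as a named set of regex replacements, and replace.py then gets pointed at them by name. Something like this, with the fix name being purely hypothetical:

% python replace.py -fix:ardates -start:!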

How about adding categories to a bunch of pages? How the fuck do I produce a list of places that were located in various countries? Oh, I do have that stuff in the GIS database…

% sqlite3 avarthrel.sqlite
sqlite> .output /tmp/titles.txt
sqlite> SELECT name FROM locations WHERE country = 'Varmhjelm';
sqlite> .quit
% perl -i~ -pe 's/^(.*)$/[[$1]]/gi;' /tmp/titles.txt
% python add_text.py -file:/tmp/titles.txt -text:'[[Category:Cities of Varmhjelm]]' -summary:'Adding category: Cities of Varmhjelm'

…oh shit, I added a whole bunch of pages to the wrong category. Okay, I’ll just copy-paste the list from the article to a file, run replace.py, and tell it to get the article list from that file.
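
That rescue run is roughly this shape (the file name is made up, and let’s say the Varmhjelm category was the one that had to go); -file: tells replace.py to touch only the pages listed in the file:

% python replace.py -file:/tmp/oops.txt '[[Category:Cities of Varmhjelm]]' '' -summary:'Removing a category that was added in error'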

See? Processing 400 pages took 4 days.

I think I’m willing to forgive all the sins MediaWiki has committed in the past. It’s a great piece of software. The worst of all wiki software packages, except for all the other packages that have been tried from time to time.