Addshore

It's a blog

Tag: mediawiki (page 2 of 2)

MediaWiki CRAP – The worst of it

I don’t mean MediaWiki is crap! The Change Risk Anti-Patterns (CRAP) index is calculated from the cyclomatic complexity and code coverage of a unit of code. Complex, untested code has a higher CRAP index than simple, well-tested code. Over the last two years I have been tracking the CRAP index of some of MediaWiki’s more complex classes as reported by the automatic coverage reports, and this is a simple summary of what has been happening.
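For reference, the commonly cited formula for a method m is:

CRAP(m) = comp(m)² × (1 − cov(m))³ + comp(m)

where comp(m) is the cyclomatic complexity of the method and cov(m) is its unit test coverage as a fraction between 0 and 1. Fully covered code scores just its complexity, while completely untested code scores its complexity squared plus the complexity itself.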


Misled by PHPUnit at() method

So it turns out the at() method doesn’t quite do what I had initially thought….

I have recently been working on some tests for the new Newsletter extension for MediaWiki, specifically to test the NewsletterTablePager class. This class extends MediaWiki’s TablePager class, which is designed to make it easy to display information from a database table on a special page, and to easily enable things such as sorting.

The code interacts with the database and gets a ResultWrapper object, and the Pager uses the numRows(), seek() and fetchObject() methods, all of which I thought would be incredibly simple to mock.

Attempt 1

My first attempt, where I first noticed I had been thinking about the at() method all wrong, can be seen below:
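(A reconstruction from the description that follows; the helper name and mocked class are illustrative, and the real test lives in the Gerrit change linked at the end of this post.)

```php
private function getMockDatabase( array $fetchObjects ) {
	$db = $this->getMockBuilder( 'DatabaseMysql' )
		->disableOriginalConstructor()
		->getMock();

	// numRows() and seek() are stubbed with any(), but every call to
	// them still advances the mock object's invocation counter...
	$db->expects( $this->any() )
		->method( 'numRows' )
		->will( $this->returnValue( count( $fetchObjects ) ) );

	// Broken assumption: that at( $index ) is per method, so that
	// fetchObject() returns $fetchObjects[0] on its first call,
	// $fetchObjects[1] on its second, and so on.
	foreach ( $fetchObjects as $index => $object ) {
		$db->expects( $this->at( $index ) )
			->method( 'fetchObject' )
			->will( $this->returnValue( $object ) );
	}

	return $db;
}
```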

This method returns a mock Database that the Pager will use. As you can see, the only parameter is an array of objects to be returned by fetchObject(), and I am using the at() method provided by PHPUnit to return each object at the index at which it is stored in the array. This is when I discovered that at() in PHPUnit does not work in the way I first thought…

at() refers to the index of calls made to the mocked object as a whole. This means that in the code sample above, all of the calls to numRows() and seek() increase the object’s call counter, so my mocked fetchObject() method either returns the wrong value or returns null.

Attempt 2

In my second attempt I guessed that PHPUnit might allow multiple method mocks to stack, with the return values of those methods being returned in the order in which they were created, so I changed my loop to simply use any():
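(Again a reconstruction, not the exact code.)

```php
// Guess: maybe stubs stack, so each any() expectation would be
// consumed in the order it was created?
foreach ( $fetchObjects as $object ) {
	$db->expects( $this->any() )
		->method( 'fetchObject' )
		->will( $this->returnValue( $object ) );
}
```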

But of course this also does not work; it results in the same $resultObject being returned for all calls.

Final version

I ended up having to do something a little bit nasty (in my opinion): use returnCallback(), with a private member of the test case acting as a call counter / per-method index within the callback:
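(Reconstructed; the counter member name is illustrative.)

```php
private $fetchObjectCounter = 0;

private function getMockDatabase( array $fetchObjects ) {
	$db = $this->getMockBuilder( 'DatabaseMysql' )
		->disableOriginalConstructor()
		->getMock();

	// numRows() and seek() stubbed with any() as before.

	$db->expects( $this->any() )
		->method( 'fetchObject' )
		->will( $this->returnCallback( function () use ( $fetchObjects ) {
			// Keep our own per-method index, since at() counts calls
			// across the whole mock object.
			$object = isset( $fetchObjects[$this->fetchObjectCounter] )
				? $fetchObjects[$this->fetchObjectCounter]
				: false; // fetchObject() returns false when no rows remain
			$this->fetchObjectCounter++;
			return $object;
		} ) );

	return $db;
}
```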

Notes

It would be great if PHPUnit had some form of per-method index expectation!

Some of the code examples here are slightly cut down; the full change can be seen at https://gerrit.wikimedia.org/r/#/c/234727/ in the NewsletterTablePagerTest file.

If only I had Googled the issue sooner! http://www.andrejfarkas.com/2012/07/phpunit-at-method-to-check-method-invocation-at-certain-index/ (Blog post now dead :()

Removing use of MediaWiki’s API ‘rawmode’ in Wikibase

Rawmode was a boolean value used to determine whether an API result formatter in MediaWiki needed extra metadata in order to correctly format the result output. The main use of this metadata was in the XML output of the MediaWiki API. How hard can removing it be? This is the story of the struggle to remove the use of this single boolean value from the Wikibase codebase.

Overview

The first commit for this task was made on the 6th of July 2015 and the final commit was about to be merged on the 27th of August. So the whole removal took just under two months.

During these two months roughly 60 commits were made and merged working towards the removal.

Overall 9290 lines were removed and 5080 lines were added.

I’m glad that is all done. (This analysis can be found on Google Sheets.) Sorry there are not more pictures in this post…

Reason for removal

Well, rawmode is being removed from MediaWiki to reduce API complexity. Instead of having to check what the API formatters need, the API will hand all formatters the full metadata, and each formatter will simply use what it needs and discard the rest.
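To illustrate the pattern being removed (a sketch, not code from the actual patches; the ‘item’ tag name is made up):

```php
// Before: XML-specific metadata was only added when the formatter
// requested raw mode.
if ( $result->getIsRawMode() ) {
	ApiResult::setIndexedTagName( $items, 'item' );
}

// After: the metadata is always added, and formatters that do not
// need it (JSON, PHP serialization, ...) discard it when formatting.
ApiResult::setIndexedTagName( $items, 'item' );
```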

The change to “Finish killing ‘raw mode'” can be seen on Gerrit and has been around since April of this year. The relevant task can be found on Phabricator.

Process overview

The first step on the path was to remove the old serialization code from Wikibase (otherwise known as the lib serialization code) and replace all usages with the new WikibaseDataModelSerialization component. This component was already used in multiple other places in the code, but not in the API, which at the time relied on the way the lib serialization code handled the rawmode requirement.

Removal of the lib serialization code was the first of the two major parts of the process, and after around 50 commits I managed to remove it all! Hooray for removing 6,000 lines with no additions in a single commit…

The next and final step was to make the ResultBuilder class in Wikibase always provide metadata for the API and to remove any dirty hacks that I had introduced in order to kill the lib code. Again this was done over the course of multiple commits, mainly adding tests for the XML output, which at the time was barely tested. Finally, a breaking change had to be made to remove many of the hacks I had added along with the final uses of raw mode.

The final two commits can be seen at http://gerrit.wikimedia.org/r/#/c/227686/ and http://gerrit.wikimedia.org/r/#/c/234258/

Final notes

Look! You can even see this on the GitHub code frequency graph (just…)

You can also find my draft API break announcement post here.

— edit —

It looks like I did incorrectly break one thing (https://phabricator.wikimedia.org/T110668), though a fix is on its way out to the beta site! :)

MassAction MediaWiki extension

MassAction is a MediaWiki extension that allows users to perform mass actions on targets through a special page, making use of the job queue. Its development started at some point in 2014, and a very rough experimental version is now available. Below are the basics.

Basic Concepts

  • Tasks are individual mass actions, made up of smaller actions that are applied to multiple targets selected using matchers; for example, replace the word ‘hello’ with ‘goodbye’ on all wiki pages whose title contains the word ‘Language’ and that are not redirects.
  • Actions are processes that can be applied to Targets to alter some of the data they contain, for example changing the title (a move) or changing an article’s text.
  • Matchers or Filters are sets of rules that are used to match certain targets, for example all articles that contain the word ‘hello’.

All of these concepts are stored in new database tables (seen below).
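In code terms you could picture the split roughly like this (purely illustrative; these are not the extension’s actual class names or schema):

```php
// A Matcher/Filter decides whether a given target is affected.
interface Matcher {
	public function matches( Title $target );
}

// An Action alters some of the data a matched target contains.
interface Action {
	public function apply( WikiPage $target );
}

// A Task ties a target type, matchers and actions together, and is
// persisted to the new database tables so the job queue can work
// through it.
class Task {
	public $targetType; // e.g. article, image, Wikibase item
	public $matchers = array();
	public $actions = array();
	public $state; // the task's current workflow state
}
```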

Appearance

The main interaction with the extension is done through a special page. This page allows the creation of tasks as well as the viewing of previously created tasks and various actions such as saving changes.

A wireframe showing task creation can be seen below. It allows basic information about the task to be entered, such as the type of target we want to change; this could be an image, an article, a Wikibase item, etc. It also allows for a summary of the changes that will be made.

The lower sections of the page allow an unlimited number of Actions and Matchers/Filters to be added.

The version of the special page that allows users to view tasks is slightly different and can be seen below.

The main differences here are that no new data can be added (it is simply presented), and that a task state and a list of targets are now shown.

Upon creation, a task will make its way through various states (seen below).

Once the targets have been found they will appear in the targets list on the special page and users will be able to either save changes individually or save a whole list of changes.

Current State

The code is currently stored on Wikimedia’s Gerrit tool and is mirrored onto GitHub. All issues are now tracked in Phabricator and the current workboard can be found here.

A screenshot of the current special page for task creation can be seen to the right.

Of course, at this early stage lots of things are missing, and I hope I find the time to work on this over the next year.

Wikimedia Hackathon 2015 (Lyon)

By Jean-Philippe Kmiec & Sylvain Boissel (Own work) [CC BY-SA 4.0], via Wikimedia Commons

This year’s Wikimedia Hackathon was held in Lyon, France, at Valpré-Lyon between the 23rd and 25th of May.

The hotel (Valpré-Lyon) was absolutely beautiful, with large grassy areas, great architecture, and a place for you whether you wanted to have a large or small discussion, sit quietly, or sit outside. As well as pétanque, table tennis was available, along with plenty of people to meet!

Valpré Castel

Some of the hackathon grounds. By Alex Cella (Own work) [CC BY-SA 4.0], via Wikimedia Commons

I planned on primarily hacking on my MassAction extension along with one or two others, but as at any hackathon I got massively distracted talking to people and working on other projects.

Quick overview of Orain

Orain is a community-driven, not-for-profit wiki network that I help to maintain.

It runs MediaWiki and has been around for the past couple of years. Over the years it has been hosted on VPSs from multiple different providers, and its technical layout has changed massively with each provider. Below I will try to summarise its current layout! This will include:

  • The machines / VPSs – (how many there are, what they are doing and why)
  • The management of these – (using Ansible)
  • The configuration of MediaWiki – (how Orain is running the ‘wikifarm’)


MediaWiki Developer Summit 2015

The MediaWiki Developer Summit is an invitation-only event with an emphasis on the evolution of the MediaWiki architecture and the Wikimedia Engineering goals for 2015.

It took place on Monday, January 26 and Tuesday, January 27, 2015 in the Mission Bay Center, San Francisco, California.

Overall it was very interesting and a lot of good discussion happened. Again, I am writing this post months after the event, so sorry for the lack of content.


Wikimania

Wikimania 2014 was a 2000+ person conference, festival, meetup, workshop, hackathon, and celebration, spread over five days in August 2014, preceded and followed by fringe events. Wikimania is the official annual event of the Wikimedia movement, where one can discover all kinds of projects that people are making with wikis and open content, as well as meet the community that produced the most famous wiki of all, Wikipedia!

The core event was held in and around The Barbican Centre in London, UK.

Watch the videos on YouTube, Commons, or LiveStream.


You can also find a blog post looking back at Wikimania from the Barbican Centre here.

Post photo from https://wikimania2014.wikimedia.org/wiki/File:Wikimania_2014_group_photo.jpeg

At future events these posts will likely be much better, as I’ll be writing them while I am there! This post was actually written in April 2015 :/

MediaWiki Hackathon Amsterdam

The MediaWiki Hackathon in Amsterdam was my first ever MediaWiki or Wikimedia hackathon, and it was great!

Unfortunately I am actually writing this post in 2015, so I don’t remember many of the details and will instead link to other places!

Wikimedia blog post reflecting on the hackathon

Wikimedia Netherlands, Wikimedia Germany, and the Wikimedia Foundation subsidized travel and accommodation for dozens of participants, enabling the highest participation in this event’s history.

Wikipedia Signpost report

Though it is difficult to pin down a central theme for a conference with 140 attendees, the choice of workshops suggest consolidation: Wikidata, Lua (slides), and Wikimedia Labs (slides) are hardly new projects and were all demo’ed at last year’s hackathon. Nevertheless, there was plenty to talk about, with upbeat developers leaving sessions excited at the progress the Foundation has made with each.

