Redirects in Wikidata are basically the same as redirects on normal wiki pages. However, they have a slightly different meaning and intention.
The main reason we need redirects is that we want to provide stable identifiers.
Merging two items has been a commonplace task since Wikidata gained momentum, as two different people or bots often create items representing the same topic or concept; the data from one then needs to be moved into the other and the now-empty item removed.
The problem with this is that removing the empty item of course means that one of the two identifiers no longer points to the topic or concept that it represents. It is therefore not a stable identifier.
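To make the idea concrete, here is a minimal sketch (in TypeScript, not Wikidata's actual implementation) of how a redirect keeps an old identifier stable after a merge. The item IDs and the label are made up for illustration: after merging Q2 into Q1, Q2 becomes a redirect, so both identifiers still resolve to the same data.

```typescript
// A toy item store: after merging Q2 into Q1, only Q1 holds data,
// but Q2 survives as a redirect instead of being deleted.
type Item = { id: string; labels: Record<string, string> };

const items = new Map<string, Item>([
  ["Q1", { id: "Q1", labels: { en: "Berlin" } }], // hypothetical surviving item
]);

// Redirect table: merged-away identifier -> surviving identifier.
const redirects = new Map<string, string>([["Q2", "Q1"]]);

// Resolve an identifier, following at most one redirect hop.
function resolve(id: string): Item | undefined {
  const target = redirects.get(id) ?? id;
  return items.get(target);
}
```

With this in place, `resolve("Q2")` returns the same item as `resolve("Q1")`: the old identifier never dangles, which is exactly the stability the redirect is there to provide.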
So last night I took another look at the ever-increasing list of domains that are blocked by various ISPs in the UK due to various court orders.
A first look
My first port of call was Wikipedia, which has an article titled ‘List of websites blocked in the United Kingdom’. Now this list, although referenced, doesn’t really contain all of the domains that are blocked. Luckily the article does include various other links.
From the Wikipedia article I then found the wiki of 451unavailable.org.uk, which lists all current UK blocking orders. There is a wiki page for each blocking order, for example UK/temp plixid, which lists 17 sites. Again, each of these pages contains lots of links, and the main set of links here lets you check which ISPs are currently blocking a given domain.
This brings us to blocked.org.uk.
Below is by far the coolest thing I have found to date (even though it takes a little while).
In 2013 and 2014 I made a few presentations to various groups of people talking about Wikidata.
When creating those presentations I used as many graphical representations of the data in Wikidata as possible to try and give people a clearer picture of what is already stored.
One of the best visualisations at the time was the Wikidata map created by Denny Vrandečić which came after the introduction of coordinate locations to Wikidata.
Below you can see a GIF showing the additions of the coordinate location property to Wikidata items over roughly the first 40 days after the coordinate data type was enabled.
Below are some of the images that I extracted from the full map for use in my presentations. Although they are now quite outdated they are still great to look at!
MediaFire (read about it on Wikipedia) is a file hosting, file synchronization, and cloud storage service. MediaFire was founded in 2006, but in 2014 it did something that really caught my eye: it increased its baseline storage offering from 100GB to 1TB and reduced the price to just $2.50 a month.
With that price beating every other easy-to-use cloud storage service out there, I jumped on the deal (which was even cheaper when committing to a whole year) and started using it for my backups and data archives.
Digital Ocean is a ‘cloud’ hosting provider focusing on VPSs with SSDs called droplets.
They are generally rather cheap for what you are getting and their entry level droplet with 512MB memory, 1 core, 20GB disk and 1TB transfer only costs $5 per month ($0.007 per hour).
Orain, one of the projects that I am helping out with, currently powers all of its services with a handful of these droplets, and through maintaining these services I realise more and more that aspects of Orain really are not suited to living on DO.
Orain is a community-driven, not-for-profit wiki network that I help to maintain.
It runs MediaWiki and has been around for the past couple of years. Over the years it has been hosted on VPSs from multiple different providers, and its technical layout has changed massively with each provider. Below I will try to summarise its current layout! This will include:
- The machines / VPSs – ( how many there are, what they are doing and why )
- The management of these – ( using ansible )
- The configuration of MediaWiki – ( how Orain is running the ‘wikifarm’ )
GitHub tracks the number of downloads for all assets (files) that are attached to a release, but currently makes it very hard for users to get at this information: the download counts are only accessible through the API.
I noticed this many months ago when wondering how many people were downloading the new C++ version of Huggle. At the time I started coming up with some odd little script that I could set running somewhere, checking the download counts and updating a static list; I even thought about just uploading the build files to some other site that tracked this information.
A few days ago I took my first look at developing Chrome extensions, for a referencing tool for Wikidata, and so today I thought: why not just make a Chrome extension that adds the download counts to the GitHub releases page!
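The core of such an extension is just summing the `download_count` field of each asset in the JSON returned by GitHub's releases endpoint (`GET /repos/{owner}/{repo}/releases`). Here is a TypeScript sketch; the sample payload below is hand-written to match the API's shape, not real Huggle numbers.

```typescript
// Minimal shapes for the fields we need from the GitHub releases API.
type Asset = { name: string; download_count: number };
type Release = { tag_name: string; assets: Asset[] };

// Sum the download counts of all assets, per release tag.
function downloadsPerRelease(releases: Release[]): Record<string, number> {
  const totals: Record<string, number> = {};
  for (const release of releases) {
    totals[release.tag_name] = release.assets.reduce(
      (sum, asset) => sum + asset.download_count,
      0
    );
  }
  return totals;
}

// Hand-written sample in the shape the API returns (illustrative values).
const sample: Release[] = [
  {
    tag_name: "v1.0.0",
    assets: [
      { name: "app-win.zip", download_count: 120 },
      { name: "app-linux.tar.gz", download_count: 80 },
    ],
  },
];
```

In the extension the `sample` array would instead come from a `fetch` to the API, and the totals would be injected next to each asset on the releases page.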
Now if you have ever come across issues caused by non-strict comparisons before, then this is going to seem like a piece of cake, but remember, everyone makes mistakes.
Strict equality compares two values for equality where the values must also be of the same type. Non-strict, or loose, equality compares values without caring about their type. Strict equality is almost always the correct comparison to use, and if it is not used when it should be, many unexpected things can happen.
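As a quick sketch of the difference, here is the distinction in TypeScript (the same idea exists in PHP and JavaScript, though the exact coercion rules shown here are JavaScript's). The values are typed as `any` so the compiler permits the cross-type comparisons being demonstrated:

```typescript
// Values typed as `any` so TypeScript allows cross-type comparison.
const zero: any = 0;
const emptyString: any = "";
const zeroString: any = "0";

// Loose equality (==) coerces types before comparing: all of these are true.
const looseResults = [zero == emptyString, zero == zeroString, zero == false];

// Strict equality (===) also requires matching types: all of these are false.
const strictResults = [
  zero === emptyString,
  zero === zeroString,
  zero === false,
];
```

Note that loose equality is not even transitive here: `"" == 0` and `"0" == 0`, yet `"" == "0"` is false, which is exactly the kind of surprise that makes strict equality the safer default.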