The Wikibase registry was one of the outcomes of the first in a series of federated Wikibase workshops organised in partnership with the European Research Council.
The aim of the registry is to act as a central point for details of public Wikibase installs hosted around the web. Data held about the installs currently includes the URL of the home page, the query frontend URL and the SPARQL API endpoint URL (if a query service exists).
During the workshop an initial data set was added, which can easily be seen using the timeline view of the query service and a query that is explained within this post.
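A timeline query against the registry would look something like the sketch below. Note that the registry defines its own local properties, so `P_INCEPTION` here is a placeholder for whichever property the registry actually uses to record a date, not a real identifier:

```sparql
# Hypothetical sketch, not the exact query from the registry:
# P_INCEPTION is a placeholder property ID for a date value.
SELECT ?wikibase ?wikibaseLabel ?inception WHERE {
  ?wikibase wdt:P_INCEPTION ?inception .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}
```

Running a query of this shape in the query service frontend and switching to the timeline view plots each Wikibase install along its date value.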
2017 has been a great year with continued work at WMDE on both technical wishes projects and also Wikibase / Wikidata related areas. Along the way I shared a fair amount of this through this blog, although not as much as I would have liked. Hopefully I’ll be slightly more active in 2018. Here are some fun stats:
- 7,992 page views
- 5,350 visitors
- 4 total posts (that’s terrible, but it’s April 2018 now and I already have 6 under my belt)
- 1,840 words
Top 5 posts by page views in 2017 were:
- Guzzle 6 retry middleware
- Misled by PHPUnit at() method
- Wikidata Map July 2017
- Add Exif data back to Facebook images
- Github release download count – Chrome Extension
To make myself feel slightly better we can have a look at GitHub and the apparent 1,203 contributions in 2017:
It’s time for the first 2018 installation of the Wikidata Map. It has been roughly 4 months since the last post, which compared July 2017 to November 2017. Here we will compare November 2017 to March 2018. For anyone new to this series of posts you can check back at the progression of these maps by looking at the posts on the series page.
Each Wikidata Item with a Coordinate Location (P625) will have a single pixel dot. The more Items present, the more pixel dots and the more the map will glow in that area. The pixel dots are plotted on a totally black canvas, so any land mass outline simply comes from the mass of dots. You can find the raw data for these maps and all historical maps on Wikimedia Tool Labs.
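The plotting idea can be sketched in a few lines of Python. This is a minimal illustration of the technique, not the actual map-generation code: the canvas size and the simple equirectangular projection are my assumptions.

```python
# Sketch: plot (lat, lon) pairs as single pixel dots on a black canvas.
# 360x180 gives one pixel per degree; the real maps are far larger.
WIDTH, HEIGHT = 360, 180

def to_pixel(lat, lon, width=WIDTH, height=HEIGHT):
    """Map a (lat, lon) pair to (x, y) pixel coordinates
    using a simple equirectangular projection."""
    x = int((lon + 180) / 360 * (width - 1))
    y = int((90 - lat) / 180 * (height - 1))
    return x, y

def plot(coords, width=WIDTH, height=HEIGHT):
    """Return a 2D grid of 0/1 values; 1 marks a lit pixel dot.
    The grid starts all black (0), so any outline comes purely
    from the mass of dots, as described above."""
    canvas = [[0] * width for _ in range(height)]
    for lat, lon in coords:
        x, y = to_pixel(lat, lon, width, height)
        canvas[y][x] = 1
    return canvas

# Example: two Items, roughly Berlin and London
canvas = plot([(52.52, 13.40), (51.51, -0.13)])
```

A real render would also accumulate counts per pixel (so denser areas glow brighter) rather than just setting a flag.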
Looking at the two maps below (the more recent map being on the right) it is hard to see the differences by eye, which is why I’ll use ImageMagick to generate a comparison image. Previous comparisons have used Resemble.js.
While working on a new MediaWiki project, and trying to set up a Kubernetes cluster on Wikimedia Cloud VPS to run it on, I hit a couple of snags. These were mainly to do with ingress into the cluster through a single static IP address and some sort of load balancer, which is usually provided by your cloud provider. I faffed around with various NodePort setups, custom load balancer configurations and ingress configurations before finally arriving at a solution that worked for me, using ingress and a traefik load balancer.
Below you’ll find my walkthrough, which works on Wikimedia Cloud VPS, an OpenStack powered public cloud solution. It should also work on any other VPS host or a bare metal setup with few or no alterations.
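The core of the approach is an Ingress resource routed through traefik acting as the ingress controller on the node holding the static IP. The manifest below is a minimal sketch of that shape, not the exact manifests from the walkthrough; the names, namespace and hostname are placeholders:

```yaml
# Sketch only: an Ingress handled by a traefik ingress controller,
# which listens on the node with the single static IP address.
# (extensions/v1beta1 was the current Ingress API version at the time.)
apiVersion: extensions/v1beta1
kind: Ingress
metadata:
  name: example-ingress            # placeholder name
  annotations:
    kubernetes.io/ingress.class: traefik
spec:
  rules:
    - host: example.example.org    # placeholder hostname
      http:
        paths:
          - path: /
            backend:
              serviceName: example-service   # placeholder service
              servicePort: 80
```

With traefik bound to the static IP (for example via a hostPort), all HTTP traffic to that address is then routed into the cluster according to the Ingress rules, which is the load balancer role a managed cloud provider would normally fill.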
UPDATE 02/04/2018: It looks like AggregateIQ may have had a contract with Cambridge Analytica, but didn’t disclose it because of an NDA… all spoilt by an insecure GitLab instance. https://nakedsecurity.sophos.com/2018/03/28/cambridge-analyticas-secret-coding-sauce-allegedly-leaked/
I wonder why AggregateIQ state that they have never entered into a contract with Cambridge Analytica, but don’t mention SCL. Except they do mention that they have never been part of SCL or Cambridge Analytica…
Channel 4 report on Brexit and AggregateIQ
From the AggregateIQ website & press release:
AggregateIQ is a digital advertising, web and software development company based in Canada. It is and has always been 100% Canadian owned and operated. AggregateIQ has never been and is not a part of Cambridge Analytica or SCL. Aggregate IQ has never entered into a contract with Cambridge Analytica. Chris Wylie has never been employed by AggregateIQ.
AggregateIQ works in full compliance within all legal and regulatory requirements in all jurisdictions where it operates. It has never knowingly been involved in any illegal activity. All work AggregateIQ does for each client is kept separate from every other client.
I was looking for a new tool for easily visualizing git branches and workflows, to try and visually show how Gerrit works (in terms of git basics) and clear up some confusion. I spent a short while reading Stack Overflow, although most of the suggestions weren’t much good, as I didn’t want to visualize a real repository but a fake set of hypothetical branches and commits.
A friend suggested Graphviz, and I quickly found webgraphviz.com, which was going in the right direction, but it would require me to learn how to write DOT graph files.
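For a sense of what that entails, a fake set of branches and commits of the kind described above can be written in DOT something like this. The commit and branch names are purely hypothetical:

```dot
// Sketch of a hypothetical history: a master branch with three
// commits and a feature branch forked from the second commit.
// Edges here run left-to-right in chronological order for readability.
digraph git {
  rankdir=LR;
  node [shape=circle];

  C1 -> C2 -> C3;            // commits on master
  C2 -> F1 -> F2;            // feature branch from C2

  master  [shape=box]; master  -> C3;   // branch labels point at tips
  feature [shape=box]; feature -> F2;
}
```

Pasting a graph like this into webgraphviz.com (or running it through `dot -Tpng`) renders the branch structure without needing a real repository.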
This post relates to the WikiWhat YouTube video entitled “Adam Conover Does Not Like Fact Checking | WikiWhat Epsiode 4” by the channel Cntrl+Alt+Delete. It would appear that the video went slightly viral over the past few days, so let’s take a quick look at the impact that had on the Wikipedia page views for Adam’s article.
Back in 2016 I wrote a short hacky script for taking HTML from Facebook data downloads and adding any data possible back to the image files that also came with the download. I created this as I wanted to grab all of my photos from Facebook, upload them to Google Photos and have Google automatically slot them into the correct place in the timeline. Recent news articles about Cambridge Analytica and the harvesting of Facebook data have led to many people deciding to leave the platform, so I decided to check back with my previous script to see if it still worked, and make it a little easier to use.
It has only been 4 months since my last Wikidata map update post, but the difference on the map in these 4 months is much greater than the difference shown in my last post, which covered 9 months. The whole map is covered with pink (additions to the map). The main areas include Norway, Germany, Malaysia, South Korea, Vietnam and New Zealand, to name just a few.
This is a belated post about the Wikibase Docker images that I recently created for the Wikidata 5th birthday. You can find the various images on Docker Hub and matching Dockerfiles on GitHub. Combined, these images allow you to quickly create Docker containers for Wikibase, backed by MySQL, with a SPARQL query service running alongside and updating live from the Wikibase install.
A setup was demoed at the first WikidataCon event in Berlin on the 29th of October 2017; the demo can be seen at roughly 41:10 in the video below.