Back in 2013, maps were generated almost daily to track the immediate usage of the then-new coordinate location property within the project. Denny & Lydia then created an animation showing this amazing growth, which can be seen on Commons here. Recently we found the original images used to make this animation, starting in June 2013 and extending to September 2013, and to celebrate the fourth birthday of Wikidata we decided to make a few new animations.
It has been another 5 months since my last post about the Wikidata maps, and again some areas of the world have lit up. Since my last post at least 9 noticeable areas have appeared with many new items containing coordinate locations. These include Afghanistan, Angola, Bosnia & Herzegovina, Burundi, Lebanon, Lithuania, Macedonia, South Sudan and Syria.
The difference map below was generated using Resemble.js; the pink areas highlight the differences between the two maps from April and October 2016.
Below you will find an internal WMDE presentation covering the general area of WMDE Metric & Data Gatherings from 2016.
This presentation follows on from the initial introduction to engineering analytics activities.
The presentation skims through:
- WMDE Grafana dashboards
- The Wikimedia Analytics landscape
- Grafana & Graphite
- Hadoop, Kafka, Hive & Oozie
- EventLogging, MySQL replicas & MediaWiki logs
- Our analytics scripts
- How to get access
Geospatial search is up and running for the Wikidata Query Service! This allows you to search for items with coordinates that are located within a certain radius or within a bounding box.
Alongside the map that can be used to display results for the query service, this really is a great tool for quickly visualizing coverage.
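Radius search is exposed through the query service's `wikibase:around` SPARQL service. As a rough sketch (the query shape follows the standard WDQS geospatial service; the helper function, center point and radius here are example values of my own, not from the post), a query for items with a coordinate location within 10 km of a point can be built like this:

```python
# Sketch: build a WDQS geospatial query using the wikibase:around service.
# The wdt:, wikibase:, bd: and geo: prefixes are predefined on the
# query.wikidata.org endpoint, so the query needs no PREFIX lines there.

def around_query(lon: float, lat: float, radius_km: float, limit: int = 100) -> str:
    """Return a SPARQL query for items with a coordinate location (P625)
    within radius_km of the given point (WKT order: longitude latitude)."""
    return f"""
SELECT ?item ?location WHERE {{
  SERVICE wikibase:around {{
    ?item wdt:P625 ?location .
    bd:serviceParam wikibase:center "Point({lon} {lat})"^^geo:wktLiteral .
    bd:serviceParam wikibase:radius "{radius_km}" .
  }}
}}
LIMIT {limit}
""".strip()

# Example values only: roughly central Berlin, 10 km radius.
query = around_query(13.4050, 52.5200, 10)
# The resulting string can then be sent to https://query.wikidata.org/sparql,
# e.g. with Accept: application/sparql-results+json.
print(query)
```

A bounding-box search works the same way via `SERVICE wikibase:box`, with `wikibase:cornerWest` and `wikibase:cornerEast` parameters instead of a center and radius.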
I originally posted about the Wikidata maps back in early 2015 and have followed up with a few posts since, looking at interesting developments. This is another one of those posts, covering the changes from the last post, in late 2015, to now, May 2016.
The new maps look very similar to the naked eye and the new ‘big’ map can be seen below.
So while at the 2016 Wikimedia Hackathon in Jerusalem I teamed up with @valhallasw to generate some diffs of these maps, in a slightly more programmatic way than my posts following up on the 2015 Wikimania!
Wikidata is a multilingual project, but due to the size of the project it is hard to get a view on the usage of languages.
For some time now the Wikidata dashboards have existed on the Wikimedia Grafana install. These dashboards contain data about the language content of the data model, by looking at terms (labels, descriptions and aliases), as well as data about the language distribution of the active community.
For reference, the dashboards used are:
All data below was retrieved on 1 February 2016
On the 28th of January 2016, all Wikimedia MediaWiki APIs had two short outages. The outages are documented on Wikitech here.
The outages didn’t have much of an impact on most projects hosted by Wikimedia. However, because most Wikidata editing happens through the API, even when using the UI, the project basically stopped for roughly 30 minutes.
Interestingly there is an unusual increase in the edit rate 30 minutes after recovery.
I wonder if this is everything that would have happened in the gap?
Wikidata provides free and open access to entities representing real-world concepts. Of course, Wikidata is not meant to contain every kind of data; beer reviews or product reviews, for example, would probably never make it into Wikidata items. However, creating an app powered by Wikidata & Wikibase to contain beer reviews should be rather easy.
This is an internal WMDE presentation given at the start of “metric driven development” for the WMDE & Wikidata dev teams.
The presentation is pretty bare bones but it provides a general overview of the plan and links to the current state of things.
The main area of interest from the presentation is the resources slide copied below for convenience.
Recently some articles appeared on the English Wikipedia Signpost about Wikidata (1, 2, 3). Reading these articles, especially the second and third, pushed me to try to make a dent in the ‘problem’ of references on Wikidata. It turns out that this is actually not that hard!