Asking Bing Chat AI to reference Wikidata
I previously compared ChatGPT and Bing Chat AI on the question of “What is Wikibase Cloud?”. That comparison and the discussion around it highlighted problems with using ChatGPT alone. It seems to like inventing URLs that look right but have never existed, as it primarily wants to have a good conversation…
Wikimedia Enterprise: A first look
Wikimedia Enterprise is a new (now one-year-old) service offered by the Wikimedia Foundation via Wikimedia, LLC, a wholly-owned LLC that provides opt-in services for third-party content reuse, delivered via APIs. In essence, Wikimedia Enterprise is an optional product that third parties can choose to use, repackaging data from within…
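As a taste of what those API services look like, here is a minimal sketch of fetching an article from the Enterprise On-demand API in Python. The endpoint path, API version, and bearer-token authentication are assumptions based on the public documentation, not something verified here:

```python
import requests

# Assumed On-demand endpoint of the Wikimedia Enterprise API; the exact
# path and version may differ, see https://enterprise.wikimedia.com/docs/.
ENDPOINT = "https://api.enterprise.wikimedia.com/v2/articles/Josephine_Baker"

# Hypothetical access token obtained from a Wikimedia Enterprise account.
TOKEN = "YOUR_ACCESS_TOKEN"

response = requests.get(
    ENDPOINT,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
response.raise_for_status()

# The response bundles article metadata alongside the repackaged content.
print(response.json())
```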
A first Wikidata query service JNL file for public use
Back in 2019 I wrote a blog post called Your own Wikidata Query Service, with no limits, which documented loading a Wikidata TTL dump into your own Blazegraph instance running within Google Cloud, a nearly two-week process. I ended that post speculating that part 2 might be using a “pre-generated Blazegraph journal file to…
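Once Blazegraph has been started against such a journal file, you can sanity-check that the data loaded with a single SPARQL request. A minimal sketch, assuming the stock WDQS Blazegraph bundle serving SPARQL at its default local address:

```python
import requests

# Default endpoint of a locally running WDQS-style Blazegraph instance;
# adjust the host, port, or namespace if your setup differs.
ENDPOINT = "http://localhost:9999/bigdata/namespace/wdq/sparql"

# Fetch a single triple: a cheap check that the journal file loaded.
QUERY = "SELECT * WHERE { ?s ?p ?o } LIMIT 1"

response = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=60,
)
response.raise_for_status()
print(response.json()["results"]["bindings"])
```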
Wikidata query service updater evolution
The Wikidata Query Service (WDQS) sits in front of Wikidata and provides access to its data via a SPARQL API. The query service itself is built on top of Blazegraph, but in many regards it is very similar to any other triple store that provides a SPARQL API. In the early days of the query…
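To make the shape of that SPARQL API concrete, here is a small example of querying the public WDQS endpoint with Python's requests library; the query itself is just an illustration:

```python
import requests

# Public WDQS SPARQL endpoint that sits in front of Blazegraph.
ENDPOINT = "https://query.wikidata.org/sparql"

# Fetch the English label of Douglas Adams (Q42); the wd: and rdfs:
# prefixes are predefined on the WDQS endpoint.
QUERY = """
SELECT ?label WHERE {
  wd:Q42 rdfs:label ?label .
  FILTER(LANG(?label) = "en")
}
"""

response = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={
        "Accept": "application/sparql-results+json",
        # WDQS etiquette asks clients to identify themselves.
        "User-Agent": "example-wdqs-client/0.1 (https://example.org)",
    },
    timeout=60,
)
response.raise_for_status()
for row in response.json()["results"]["bindings"]:
    print(row["label"]["value"])
```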
WikiCrowd at 50k answers
In January 2022 I published a new Wikimedia tool called WikiCrowd. This tool lets people answer simple questions in order to contribute edits to Wikimedia projects such as Wikimedia Commons and Wikidata. It’s designed to handle a wide variety of questions, but due to time constraints, the extent of the current questions…
Wikidata maxlag, via the ApiMaxLagInfo hook
Wikidata tinkers with the maxlag concept, which has existed in MediaWiki for some years, in order to slow automated editing at times of lag in various systems. Here you will find a little introduction to MediaWiki maxlag and the ways that Wikidata hooks into the value, altering it for its needs. As you can…
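As a rough illustration of the client side of this mechanism, the sketch below sends maxlag with every request and backs off when the API reports lag. The retry loop is an assumption for illustration, but the maxlag parameter, the “maxlag” error code, and the Retry-After header are standard MediaWiki API behaviour:

```python
import time
import requests

API = "https://www.wikidata.org/w/api.php"

def api_get(params, max_retries=5):
    """Call the MediaWiki API with maxlag=5, sleeping and retrying
    whenever the servers report that replication lag is too high."""
    params = {**params, "maxlag": 5, "format": "json"}
    for _ in range(max_retries):
        response = requests.get(API, params=params, timeout=30)
        data = response.json()
        if data.get("error", {}).get("code") == "maxlag":
            # The Retry-After header suggests how long to wait.
            time.sleep(int(response.headers.get("Retry-After", 5)))
            continue
        return data
    raise RuntimeError("API still lagged after retries")

# Example: fetch site info while politely honouring maxlag.
print(api_get({"action": "query", "meta": "siteinfo"})["query"]["general"]["sitename"])
```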
Wikibase: a history
I have had the pleasure of being part of the Wikibase journey in one way or another since 2013, when I first joined Wikimedia Germany to work on Wikidata. That long-running relationship with the project should put me in a fairly good position to give a high-level overview of its history, from both a technical and…
Profiling a Wikibase item creation on test.wikidata.org
Today I was in a Wikibase Stakeholder group call, and one of the discussions was around Wikibase import speed, data loading, and the APIs. My previous blog post covering what happens when you make a new Wikibase item came up, and we also got onto the topic of profiling. So here comes another post looking…
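For a flavour of the request involved, here is a minimal sketch of creating an item on test.wikidata.org via wbeditentity while asking for profiling output with forceprofile=1. The anonymous-token shortcut is an assumption (a real run may need a logged-in session), and the profiling report is appended to the response body, which is why the raw text is printed:

```python
import json
import requests

API = "https://test.wikidata.org/w/api.php"
session = requests.Session()

# Fetch a CSRF token; anonymously this is typically "+\\", which only
# works if the wiki allows anonymous item creation (an assumption here).
token = session.get(API, params={
    "action": "query", "meta": "tokens", "format": "json",
}).json()["query"]["tokens"]["csrftoken"]

# Create a minimal item, asking the servers to append a profiling
# report to the response via forceprofile=1.
response = session.post(API, data={
    "action": "wbeditentity",
    "new": "item",
    "data": json.dumps({"labels": {"en": {
        "language": "en", "value": "Profiling test item",
    }}}),
    "token": token,
    "format": "json",
    "forceprofile": 1,
})

# Printed raw: the profiling text follows the JSON payload.
print(response.text)
```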