*Apologies for cross-posting*
Hello all,
The Wikidata development team is currently doing research to better
understand how people access and reuse Wikidata’s data from the code of
their applications and tools (for example, through APIs), and how we can
improve our tools to make your workflows easier.
We are running a short survey to gather more information from people who
build tools based on Wikidata’s data. If you would like to participate,
please use this link
<https://docs.google.com/forms/d/e/1FAIpQLSfJ-I_Ib2EOuRVG4XfeUazhXTvgKsjcKhA…>
(Google Forms, estimated fill-in time 5min). If you don’t want to use
Google Forms, you can also send me a private email with your answers. We
would love to get as many answers as possible before June 9th.
The data will be collected anonymously and will only be shared in
aggregated form.
If you have any questions, feel free to reach out to me directly.
Cheers,
--
Mohammed Sadat Abdulai
*Community Communications Manager for Wikidata/Wikibase*
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Hi all!
I’m Dan Shick ( https://w.wiki/RDs ), the new technical writer at
Wikimedia Deutschland. My goals are to discover, improve, unify and
round out documentation for the Wikibase & Wikidata development team;
my specific duties are defined by my team leadership and the
leadership of both products.
I see a lot of documentation out there, and it needs organizing so
that people in every audience can find the information they’re looking
for. Audiences include volunteers and the community, employees of
Wikimedia Deutschland, and independent users of the products, and I see
plenty of overlap between those groups. Perhaps most importantly, if
the documentation someone needs doesn’t exist, I want to see it get
written.
My first task is to collect and improve the Wikibase post-install
documentation. I have a lot of resources already on the table, but of
course I welcome pointers to and feedback on any and all existing
documentation.
You'll find this text on my wiki page as well; if you want to say hi
or have any questions or comments, feel free to shoot me an email or
speak up on my talk page.
Wiki: https://meta.wikimedia.org/wiki/User:Dan_Shick_(WMDE)
Phabricator: https://phabricator.wikimedia.org/p/danshick-wmde/
--
Dan Shick
Technical Writer
Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0 (reception)
https://wikimedia.de
Stay up to date with news and stories about Wikimedia, Wikipedia and
free knowledge by subscribing to our (German) newsletter.
We envision a world where all human beings can freely share in the sum
of all knowledge. Help us achieve that vision! Donate at
https://spenden.wikimedia.de .
Wikimedia Deutschland – Gesellschaft zur Förderung Freien Wissens e.
V. Registered in the register of associations at the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as a non-profit
by the Finanzamt für Körperschaften I Berlin, tax number 27/029/42207.
Hi,
Is there any guidance on how to poll the recent changes feed of a
MediaWiki instance (in particular a Wikibase one) to keep up with its
stream of edits? Specifically, how can this be done responsibly
(without hammering the server), and how can the consumer make sure it
sees every change?
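To make the question concrete, here is the rough shape of the polling
loop I have in mind, using the standard action API
(action=query&list=recentchanges) and maxlag to stay polite. The
endpoint, delay and property list below are placeholders, not a
recommendation:

import time
import requests

API_URL = "https://www.wikidata.org/w/api.php"  # placeholder endpoint

def poll_recent_changes(last_timestamp, poll_delay=10):
    """Yield recent changes from oldest to newest, resuming from last_timestamp."""
    session = requests.Session()
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcprop": "title|ids|timestamp|user|comment",
        "rcdir": "newer",           # oldest first, so nothing gets skipped
        "rcstart": last_timestamp,  # resume from the last change we processed
        "rclimit": "max",
        "maxlag": 5,                # back off when replication lag is high
        "format": "json",
    }
    while True:
        data = session.get(API_URL, params=params).json()
        if data.get("error", {}).get("code") == "maxlag":
            time.sleep(poll_delay)  # the server asked us to slow down
            continue
        for change in data.get("query", {}).get("recentchanges", []):
            yield change
            last_timestamp = change["timestamp"]
        if "continue" in data:
            params.update(data["continue"])  # more results in this window
        else:
            params.pop("continue", None)     # caught up: wait, then poll again
            params.pop("rccontinue", None)
            params["rcstart"] = last_timestamp
            time.sleep(poll_delay)

Because rcstart is inclusive, the last change seen before a pause comes
back on the next poll, so the consumer still has to deduplicate by rcid.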
EditGroups (https://tools.wmflabs.org/editgroups/) currently uses the
WMF Event Stream for this. That works well, but the stream is not
available for non-WMF wikis and offers no server-side filtering, so I
have been looking into implementing recent changes polling in
EditGroups so that it can be run on other wikis.
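For reference, consuming the stream itself is only a few lines. This is
not EditGroups' actual code, just a sketch with the sseclient package;
note that the wiki filter happens on the client, which is exactly the
missing server-side filtering:

import json
from sseclient import SSEClient as EventSource

STREAM_URL = "https://stream.wikimedia.org/v2/stream/recentchange"

for event in EventSource(STREAM_URL):
    if event.event != "message" or not event.data:
        continue
    try:
        change = json.loads(event.data)
    except ValueError:
        continue
    if change.get("wiki") != "wikidatawiki":  # filtering is client-side only
        continue
    print(change["title"], change["user"])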
So far it looks like my RC polling strategy misses some edits that the
WMF Event Stream includes, so I need to improve this. RC polling is
implemented in the WDQS updater here:
https://github.com/wikimedia/wikidata-query-rdf/blob/master/tools/src/main/…
Is this the best implementation to look at?
And actually, is this really worth doing? Perhaps I should instead
require that the target Wikibase runs the EventLogging extension
(https://www.mediawiki.org/wiki/Extension:EventLogging), which exposes
the edit stream in a Kafka instance, and then implement a Kafka topic
consumer in EditGroups. That adds requirements on the Wikibase
instance, but if RC polling is brittle, it would be wrong to promise
that EditGroups can be run off a stock MediaWiki instance anyway.
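The consumer side would not be much code either. Here is a rough sketch
with the kafka-python client; the topic name and broker address are
placeholders for whatever the target instance is configured to produce
to:

import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "mediawiki.recentchange",            # placeholder topic name
    bootstrap_servers="localhost:9092",  # placeholder broker address
    group_id="editgroups",               # committed offsets let us resume after restarts
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",        # replay from the beginning on first run
)

for message in consumer:
    change = message.value
    print(change.get("title"), change.get("user"))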
(Note that I still think EditGroups is not a long-term solution; we need
a MediaWiki extension to replace it:
https://phabricator.wikimedia.org/T203557. I am just looking into this
to help our OpenRefine GSoC intern Lu Liu, who will be working on
Wikibase support in OpenRefine this summer.)
Cheers,
Antonin