
Shortcut: COM:BRFA


If you want to run a bot on Commons, you must get permission first. To do so, file a request following the instructions below.

Please read Commons:Bots before making a request for bot permission.

Requests made on this page are automatically transcluded in Commons:Requests and votes for wider comment.

Requests for permission to run a bot

Before making a bot request, please read the new version of the Commons:Bots page. Read Commons:Bots#Information on bots and make sure you have added the required details to the bot's page. A good example can be found here.

When complete, pages listed here should be archived to Commons:Bots/Archive.

Any user may comment on the merits of the request to run a bot. Please give reasons, as that makes it easier for the closing bureaucrat. Read Commons:Bots before commenting.

TabulistBot (talk · contribs)

Operator: Laboramus (talk · contributions · Number of edits · recent activity · block log · User rights log · uploads · Global account information)

Bot's tasks for which permission is being sought: This bot is a form of Listeria bot code intended to automatically maintain tabular data based on Wikidata SPARQL queries. Please see the bot homepage for more info.
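For illustration only, here is a minimal sketch (in Python, with hypothetical helper names; the actual TabulistBot code may differ) of the core transformation such a bot performs: turning SPARQL JSON results from the Wikidata Query Service into the JSON structure used by Commons tabular-data (`Data:*.tab`) pages:

```python
import json

def sparql_to_tab(sparql_json, columns, description="Generated table"):
    """Convert Wikidata SPARQL JSON results into the Commons
    'Data:*.tab' tabular-data page structure.
    `columns` is a list of (variable_name, field_type) pairs."""
    rows = []
    for binding in sparql_json["results"]["bindings"]:
        rows.append([binding.get(name, {}).get("value") for name, _ in columns])
    return {
        "license": "CC0-1.0",
        "description": {"en": description},
        "schema": {
            "fields": [{"name": name, "type": ftype} for name, ftype in columns]
        },
        "data": rows,
    }

# Example result fragment, in the shape returned by the query service API.
sample = {
    "results": {
        "bindings": [
            {"iso": {"value": "US"}, "label": {"value": "United States"}},
            {"iso": {"value": "FR"}, "label": {"value": "France"}},
        ]
    }
}

tab = sparql_to_tab(sample, [("iso", "string"), ("label", "string")])
print(json.dumps(tab, ensure_ascii=False, indent=2))
```

On each run, the bot would re-execute the query and save the regenerated JSON to the wiki page, producing one edit per update.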

Automatic or manually assisted: Automatic.

Edit type (e.g. Continuous, daily, one time run): The bot is run periodically and updates all pages that have the bot template. Currently this happens once a day, but as more templates are added we'll probably switch to hourly.

Maximum edit rate (e.g. edits per minute): Each template causes one edit per update. The frequency of updates is controllable; right now it is once a day.

Bot flag requested: (Y/N): Y.

Programming language(s): PHP.

Laboramus (talk) 00:13, 25 February 2017 (UTC)


  Agree I think this bot will provide a great way to bridge the gap between Wikidata and Wikipedia pages. For example, it will make it possible to generate lists and tables from complex SPARQL queries, and later allow a "historic" view (seeing what the data was before). In short, great work! --Yurik (talk) 00:58, 25 February 2017 (UTC)

Why do we need to do this with a bot and not in Lua with automatic queries? If the Wikidata API cannot provide such functionality yet, it should be improved. --EugeneZelenko (talk) 14:47, 25 February 2017 (UTC)
Lua can't persistently store data, and some queries may be too heavy to run every time somebody edits a page. This provides a way to have a data set that is updated less frequently than on every edit and can be shared between pages, or between different language versions of a page, without additional cost. --Laboramus (talk) 00:16, 26 February 2017 (UTC)
But why do we need such copies on Commons? They cannot be transcluded on other projects. Maybe we need a dedicated server on WMF Labs for such purposes? --EugeneZelenko (talk) 15:53, 26 February 2017 (UTC)
They don't need to be transcluded to other projects. Other projects need data, not tables, and data can be included with Lua. Whole tables cannot, and should not be. That's the whole point: for Commons to be the repository, and for other projects to reuse the data, exactly as happens with images. Relying on Labs servers for production usage is not a good idea, and I don't see why we need a separate Labs server to do what Commons is already meant to do. --Laboramus (talk) 21:54, 26 February 2017 (UTC)
Isn't Wikidata supposed to deal with data and related things? --EugeneZelenko (talk) 15:04, 27 February 2017 (UTC)
Wikidata doesn't have a way to store tabular data, especially non-trivial amounts of it. Its data model is completely different. That's why tabular data storage on Commons was created. --Laboramus (talk) 02:10, 3 March 2017 (UTC)
I have to agree that doing this by a periodic volunteer-run bot seems odd. I'm not against the bot working as an interim solution, but I'd much rather see a more integrated approach. -- (talk) 09:59, 28 February 2017 (UTC)
The Wikidata team has been discussing "query pages", something similar to this approach, but I suspect it will be another year at best until they have something working... So yes, in the interim, this is a good way to aggregate data and make it available to all projects/graphs/tables. --Yurik (talk) 02:10, 3 March 2017 (UTC)
I don't understand yet how this can be useful for Wikipedia pages. Can you show an example where this data is used or could be used? --Krd 08:38, 3 March 2017 (UTC)
Krd, you can generate a table with all countries' ISO-2 codes and names (in all languages). Afterwards, in every WMF wiki, a Lua script that needs to show a localized country name can use that table. So if you have a template that shows a few countries with some value for each, like the number of votes they cast in some UN voting, you can use the ISO-2 country code (US, FR, RU, ...) and the Lua script will automatically convert it to a full name in any language you need. It will also do fallback (if the country's name is not available in a given language, it can use English instead). And if someone updates Wikidata, all Lua scripts out there will automatically pick up the update, because this bot will update the table. In addition to the Lua scripts, all Graphs will also be able to draw things correctly. For example, if you have a map of the world with some stats on it, when the user moves the mouse over a country you want to show that country's name, or show it in the legend next to a value. All this data can come from a single table. --Yurik (talk) 04:41, 4 March 2017 (UTC)
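The lookup-with-fallback logic described above could be sketched roughly as follows (in Python rather than Lua, with a made-up slice of the bot-maintained table; the real consumer scripts would read the table from the Data page):

```python
def country_name(table, iso_code, lang, fallback="en"):
    """Return the localized country name for an ISO-2 code from a
    names table (dict of code -> {lang: name}), falling back to
    English when the requested language is missing, and to the
    code itself when the country is not in the table at all."""
    names = table.get(iso_code, {})
    return names.get(lang) or names.get(fallback) or iso_code

# Hypothetical slice of the bot-maintained table.
COUNTRY_NAMES = {
    "US": {"en": "United States", "fr": "États-Unis", "ru": "США"},
    "FR": {"en": "France", "fr": "France"},
}

print(country_name(COUNTRY_NAMES, "US", "fr"))  # → États-Unis
print(country_name(COUNTRY_NAMES, "FR", "ru"))  # no Russian entry, falls back → France
```

Because every wiki reads the same table, an update made once on Commons propagates to all consumers on the next read.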
Ok, in the end it sounds reasonable to me. How many pages do you expect to be maintained by this task by the end of 2017? Is there any reasonable way to update only pages that are used/referenced, in order to save edits on pages that exist but are not used anywhere? --Krd 14:18, 11 March 2017 (UTC)
@Laboramus: ? --Krd 17:49, 22 March 2017 (UTC)
@Krd: Sorry, I missed the ping. I don't expect very many pages, probably dozens but fewer than 100 by the end of 2017. I can look into whether the bot can check if a page has incoming links; that's probably possible. --Laboramus (talk) 01:33, 28 March 2017 (UTC)