Wikidata:Bot requests/Archive/2017/06


Remove non-Dutch category labels

Request date: 5 June 2017, by: Sjoerddebruin

Task description

Remove Dutch (nl) labels on items that use Wikimedia category (Q4167836) when they don't start with "Categorie:".

Discussion

There are a lot of category items that have a Dutch label that isn't Dutch, thanks to a faulty bot task by Edoderoobot (talkcontribslogs) last year. Can someone clean this up? There are already more than 10000 items that have a Dutch label that starts with "Category:" for example. Sjoerd de Bruin (talk) 15:56, 5 June 2017 (UTC)

Request process
  • I have a script ready for this, which works as follows: it tries to update the label from the related sitelink if one exists, otherwise it removes the unwanted label. I'll start working on this in an hour or so. XXN, 17:35, 5 June 2017 (UTC)
 Doing… XXN, 18:58, 5 June 2017 (UTC)
  • ✓ Done. Only for a few items (fewer than 30) was it possible to update the label from Dutch wiki sitelinks. For several hundred labels with numeric-only values (categories of years, e.g. Category:2020), I've just fixed the namespace prefix in the label to "Categorie:". Overall, more than 11000 labels with "Category:", ~1400 labels with "Catégorie:", and ~170 labels with "Kategorie:" were fixed or removed.
BTW, after the first run last night the SPARQL query began to time out even after decreasing the limit to 100-200, and I had to switch to SQL. XXN, 13:07, 6 June 2017 (UTC)
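The cleanup rule described above can be sketched as a plain function (an illustration of the reported behaviour, not XXN's actual script; the real one also tried to fetch a replacement label from the sitelink before removing):

```python
# Wrong prefixes reported in this thread; "Categorie:" is the correct Dutch one.
WRONG_PREFIXES = ("Category:", "Catégorie:", "Kategorie:")

def fix_nl_label(label):
    """Return the corrected Dutch label, or None if it should be removed."""
    if label.startswith("Categorie:"):
        return label                      # already correct
    for prefix in WRONG_PREFIXES:
        if label.startswith(prefix):
            rest = label[len(prefix):]
            if rest.isdigit():            # e.g. "Category:2020": fix the prefix only
                return "Categorie:" + rest
            return None                   # remove; a sitelink may supply a new label
    return label                          # other labels left untouched here
```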
This section was archived on a request by: XXN, 13:07, 6 June 2017 (UTC)

"and" in UK constituency names

For each instance of constituency of the House of Commons (Q27971968) with " and " in the name (note spaces), please create an alias, replacing " and " with " & ". Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:41, 11 June 2017 (UTC)
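The requested transformation is a one-liner; a sketch (illustrative only — how MisterSynergy actually performed the edits is not stated in this thread):

```python
def ampersand_alias(name):
    """Return the ' & ' variant of a constituency name, or None if no ' and '."""
    return name.replace(" and ", " & ") if " and " in name else None
```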

Done. You got 300 new en-aliases. —MisterSynergy (talk) 15:07, 11 June 2017 (UTC)
@MisterSynergy: Very helpful, and prompt, thank you. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:07, 11 June 2017 (UTC)
This section was archived on a request by: Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:07, 11 June 2017 (UTC)

Bot to create top 10

This section was archived on a request by: PokestarFan • Drink some tea and talk with me • Stalk my edits • I'm not shouting, I just like this font! 21:38, 15 June 2017 (UTC)

Can a bot find items with no instance of (P31) property, put them in a table (below), head it with "Top10" and today's date, and put the table into User:PokestarFan/Top10?

Can the said bot also archive tables from 1 week ago? PokestarFan (talk) (My Contribs) 20:14, 16 March 2017 (UTC)

Said table (example items, don't reuse):

Item Done
Universe (Q1)
Earth (Q2)
life (Q3)
death (Q4)
human (Q5)
Q6
Q7
happiness (Q8)
Q9
Q10
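A hypothetical sketch of the table builder such a bot would need; the wikitext layout and function name are guesses at what the request intends, not an existing bot's code:

```python
import datetime

def build_top10_table(items, date=None):
    """Render up to ten Q-ids as a dated 'Top10' wikitext table."""
    date = date or datetime.date.today().isoformat()
    lines = ["== Top10 {} ==".format(date), '{| class="wikitable"', "! Item !! Done"]
    for qid in items[:10]:
        lines.append("|-")
        lines.append("| [[{}]] ||".format(qid))
    lines.append("|}")
    return "\n".join(lines)
```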

Dates of constituencies of the French Fifth Republic

Request date: 17 June 2017, by: Tubezlob

Link to discussions justifying the request
Task description

A lot of constituency of the French Fifth Republic (Q15620943) use the properties start time (P580)/end time (P582) instead of inception (P571)/dissolved, abolished or demolished date (P576). Can someone fix that?

See this query from Oravrattas:

SELECT DISTINCT ?item ?itemLabel ?territoryLabel ?ordinal ?start ?end ?inception ?dissolution
WHERE { 
  ?item p:P31 ?statement .
  ?statement ps:P31 wd:Q15620943 . 
  OPTIONAL { ?statement pq:P1545 ?ordinal } 
  OPTIONAL { ?item wdt:P131 ?territory }
  OPTIONAL { ?item wdt:P580 ?start }
  OPTIONAL { ?item wdt:P582 ?end }
  OPTIONAL { ?item wdt:P571 ?inception }
  OPTIONAL { ?item wdt:P576 ?dissolution }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "fr" . }
}
ORDER BY ?territoryLabel ?ordinal ?itemLabel
Try it!

Thank you.
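The requested fix amounts to a property swap, sketched below with claims modelled as a plain dict (an illustration only; per the request process, the actual task was completed with QuickStatements and PetScan):

```python
# start time -> inception, end time -> dissolved/abolished/demolished date
PROPERTY_MAP = {"P580": "P571", "P582": "P576"}

def remap_claims(claims):
    """Return a copy of the claims dict with start/end times remapped."""
    return {PROPERTY_MAP.get(pid, pid): values for pid, values in claims.items()}
```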

Discussion


Request process

Task completed (18:06, 22 June 2017 (UTC)) Made with QuickStatements and PetScan.

This section was archived on a request by: Tubezlob (🙋) 18:06, 22 June 2017 (UTC)

"also known as" all redirects in wikipedias

Request date: 25 June 2017, by: מלא כל הארץ כבודי

Can someone here build a bot that can add all redirects as "also known as" in the Wikidata item connected to the article on Wikipedia? Specifically, my request is for the 114 items of the suras of the Quran (starting from Al-Fatiha), which have plenty of redirects on hewiki due to different translations and transliterations, though I think it's relevant to all items and all languages.

Not all articles are written yet, nor am I done adding redirects, so if the bot is not going to run on a daily/weekly/monthly basis then I'd be happy if, after writing the code, it wouldn't start running right away. Thanks, melo kol haaretz kevodi (talk) 10:05, 25 June 2017 (UTC)

Please use our templates {{P|xxx}} and {{Q|xxx}} when talking about specific items or properties. It makes it easier to follow what you are saying, as there's no property called "also known as" in Wikidata. ChristianKl (talk) 10:34, 25 June 2017 (UTC)
ChristianKl, I don't think it's considered a property. At the head of every item there's a table which contains the "label", "description" and "also known as". The label is used as the title of the item (alongside the Q id), under the title in black appears the description that was filled in, and under that appear the different fill-ins of "also known as" in grey, separated with |'s. I hope someone here understands what I'm talking about even if you still don't. melo kol haaretz kevodi (talk) 10:58, 25 June 2017 (UTC)
They mean aliases.
No objections to your specific request related to Al-Fatiha & hewiki, but  Oppose mass imports of all wiki redirects into aliases. At least on rowiki, tons of all-lowercase redirects to *any* page titles (including proper names) were mass-created in the past, and we ended up with redirects such as: Demografia r.s.f. iugoslavia → Demografia R.S.F. Iugoslavia (Demografia Republicii Socialiste Federative Iugoslavia), Statele unite ale americii → Statele Unite ale Americii, Aeroportul münchen franz josef strauß → Aeroportul München Franz Josef Strauß, etc., which are obviously wrong formations and are unneeded both as WD aliases and as wiki redirects. Some clean-up work has been done here, but it's far from finished yet. So, please don't blindly import redirects to WD aliases (at least from rowiki). XXN, 11:05, 25 June 2017 (UTC)
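One way to guard against the rowiki-style junk redirects described above would be a filter like this (our own illustrative heuristic, not an agreed policy or anyone's bot code):

```python
def is_alias_candidate(redirect_title, target_title):
    """Reject redirect titles that differ from the target only in capitalization."""
    if redirect_title != target_title and redirect_title.lower() == target_title.lower():
        return False  # pure capitalization variant, not a useful alias
    return True
```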
I now get what you mean. We commonly call this property "alias". In general, not all redirects make good aliases. If "Jane Doe" isn't notable but her husband "John Doe" is notable and his Wikipedia page mentions "Jane Doe", there's frequently a redirect from "Jane Doe" to "John Doe". ChristianKl (talk) 11:06, 25 June 2017 (UTC)
Right, I forgot those kinds of redirects. For the Quran chapters on hewiki there are no such redirects, so can someone build a bot for those 114 items (right now fewer than 25 have articles and redirects) and run it in another two weeks or so? melo kol haaretz kevodi (talk) 11:45, 25 June 2017 (UTC)

I withdraw my request; there is no need for such a bot. I have an option to add redirects of my choice myself. Thanks, melo kol haaretz kevodi (talk) 15:13, 26 June 2017 (UTC)

This section was archived on a request by: Matěj Suchánek (talk) 07:25, 28 June 2017 (UTC)

Use a bot to create a page that contains a list of items for improving

Request date: 14 June 2017, by: PokestarFan

Link to discussions justifying the request
Task description

Use a bot to create 1 page that contains items that match all of the criteria mentioned.

  1. Item has almost no properties (only 1 or two)
  2. Item has a label in English
Licence of data to import (if relevant)
Discussion
Use ListeriaBot, eg.
{{Wikidata list
|sparql=SELECT ?item WHERE {
  VALUES ?ct { 1 2 } .
  ?item wikibase:statements ?ct;
        wikibase:sitelinks [];
        rdfs:label ?label FILTER( LANG( ?label ) = 'en' ) .
} LIMIT 5000
|columns=item,label
}}

{{Wikidata list}}
Matěj Suchánek (talk) 06:40, 15 June 2017 (UTC)
That works well @Matěj Suchánek:, but I also want to remove any item with instance of (P31) category/template/disambiguation. Now if you thought I forgot to sign it you are right. But don't you dare add {{unsigned}} yet! PokestarFan • Drink some tea and talk with me • Stalk my edits • I'm not shouting, I just like this font! 21:35, 15 June 2017 (UTC)
You are signed just below the headline... you can add
MINUS { ?item wdt:P31 wd:Q4167836 } .
MINUS { ?item wdt:P31 wd:Q11266439 } .
MINUS { ?item wdt:P31 wd:Q4167410 } .
to the query, just before } LIMIT 5000. Matěj Suchánek (talk) 07:38, 16 June 2017 (UTC)
Request process
This section was archived on a request by:
--- Jura 13:49, 7 July 2017 (UTC)

Area of some Spanish municipalities

Request date: 25 June 2017, by: Abián

Task description

I would like to import some data about the area of a few Spanish municipalities. However, QuickStatements (Q20084080) and QuickStatements 2 (Q29032512) seem not to be working well with real numbers (ping Magnus).

Could someone import these data with a bot please?

Regards, and thanks in advance. --abián 16:41, 25 June 2017 (UTC)

Discussion
Request process

Task completed (11:50, 17 July 2017 (UTC))

This section was archived on a request by: abián 11:49, 17 July 2017 (UTC)

Import from Pauly-Wissowa on dewikisource

Request date: 23 June 2017, by: Jonathan Groß

Link to discussions justifying the request
Task description

This task is similar to the one I requested for the Allgemeine Deutsche Biographie (ADB; see here on this page) a few weeks ago; however, this current request is different as it is going to involve re-runs on a regular basis.

German Wikisource has an ongoing project (started in 2008) to digitize Paulys Realenzyklopädie der klassischen Altertumswissenschaft (Q1138524) (abbreviated RE), one of the most important reference works for classicists. Since this encyclopedia was published over a period of almost 90 years (from 1893 to 1980), several generations of scholars contributed to it (we're working on identifying them and verifying their dates on a subpage), and while some of them are still alive, a large part of the articles' authors died more than 70 years ago; hence, in the EU, their works are in the public domain.

We collect some valuable data from, and metadata about, the RE articles in the template REDaten, which in connection with the subcategories of the main project category can serve very well as a source for statements on Wikidata.

There are some things to note. First, there are (by our definition) three different kinds of articles: a) full articles, b) Verweisungen (redirects originally created by the RE editors), c) Nachträge (supplements to articles, mostly additions and amendments, sometimes replacement articles). Our policy is to create Wikisource pages for a and b (note: pages, not redirects), and add c where it was intended by the RE editors: Below the original articles (a), i.e. on the same Wikisource page.

As far as I see, there are two tasks that need to be done on a regular basis:

  1. Check s:de:Kategorie:Paulys Realencyclopädie der classischen Altertumswissenschaft for pages with no Wikidata item and create items for them
  2. Check items for pages from that category for consistency and add labels, descriptions, and statements to them.

The second task is definitely the more complex one, so I'll differentiate and try to be as specific as possible, using the RE article on Apollon as an example:

  1. Labels should be the original heading of the articles with a disambiguator like "Pauly-Wissowa" in brackets. This can be done by taking the page name (e.g. "RE:Apollon"), stripping the "RE:" part and adding " (Pauly-Wissowa)", to form the label "Apollon (Pauly-Wissowa)".
  2. Descriptions should be something generic, something like "article in Paulys Realencyclopädie der classischen Altertumswissenschaft (RE)". For instances of a, the first part should be "article", for b it should be "cross-reference".
  3. Statements should correspond with infobox template parameters (Template:REDaten):
    1. IF VERWEIS=OFF IS TRUE THEN instance of (P31)encyclopedia article (Q17329259) ELSE instance of (P31)cross-reference (Q1302249).
    2. All items should have part of (P361)Paulys Realenzyklopädie der klassischen Altertumswissenschaft (Q1138524).
    3. title (P1476) can be the same as the label, but there is a problem with languages. Most lemmata are derived from Latin and Greek, but some are actually Greek or Latin themselves. Greek (to be precise, this means Ancient Greek (Q35497)) can be identified by the characters in the title (whenever they are in Greek script), but Latin uses the same characters as German. This is something we still have to figure out.
    4. published in (P1433) can be inferred from BAND=, which adds a category as well. In our example, RE:Apollon has BAND=II,1, which puts it into Category:RE:Band II,1 = Pauly-Wissowa vol. II,1 (Q26414959). Usually, RE is cited with Roman numerals, but Arabic numerals are also in use, hence we should add both (maybe with a qualifier for the numeral system).
    5. publication date (P577) can be added like published in (P1433), since every volume has a specific date of publication.
    6. page(s) (P304) can be parsed from SPALTE_START= and SPALTE_END=.
    7. follows (P155) and followed by (P156) can be inferred from VORGÄNGER= and NACHFOLGER= respectively. If there is no such item, this statement should be left out completely (instead of making it no value).
    8. main subject (P921), arguably the most useful part of this, can be parsed either from WIKIPEDIA= or WIKISOURCE=.
    9. author (P50) can be taken from Template:REAutor at the bottom of the article. Sometimes there are multiple instances of this template in a single article. There is a Module:RE/Autoren which can help.
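Points 1-3 above can be sketched in a few lines (the function names are ours, not from any actual bot; the Greek-script check covers the basic and extended Greek Unicode blocks, and the Latin-vs-German ambiguity remains unresolved, so "de" is only a default):

```python
def re_label(page_title):
    """'RE:Apollon' -> 'Apollon (Pauly-Wissowa)'"""
    assert page_title.startswith("RE:")
    return page_title[len("RE:"):] + " (Pauly-Wissowa)"

def re_description(is_cross_reference):
    """Generic description per point 2."""
    kind = "cross-reference" if is_cross_reference else "article"
    return kind + " in Paulys Realencyclopädie der classischen Altertumswissenschaft (RE)"

def title_language(title):
    """Point 3: Ancient Greek is detectable by script; Latin vs. German is not."""
    if any("\u0370" <= ch <= "\u03ff" or "\u1f00" <= ch <= "\u1fff" for ch in title):
        return "grc"
    return "de"  # default; may actually be Latin
```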

As far as I can think of, those are the things a bot could handle best. I'm sure there are some points that need further discussion. Please feel free to tell me your opinions and ask questions below. Jonathan Groß (talk) 13:33, 23 June 2017 (UTC)

@Pasleim, Tolanor, THE IT, S8w4, Pfaerrich: Looking forward to your input! Jonathan Groß (talk) 13:33, 23 June 2017 (UTC)

Discussion

To me this all looks fine! Like I said elsewhere, however, for point 8 (which is indeed the most important part of this) to work, we first need a cleanup initiative for the RE-to-Wikipedia links. All links to WP disambiguations and lists should be removed with the help of a bot (links to pages such as w:de:Ariobarzanes, w:de:Antoninus, or w:de:Antistius etc.). There's a non-current list of all links from RE to Wikipedia at User:Pyfisch/RE. I have already corrected many of them, but many remain. --Tolanor (talk) 19:13, 25 June 2017 (UTC)

@Tolanor: I don't think we need to clean up the WIKIPEDIA= links first. On the contrary: Once all those links are transferred to Wikidata, cleanup will be a lot easier because we can run queries for main subject (P921) linking to instance of (P31)Wikimedia disambiguation page (Q4167410), then we get lists of all RE articles linking to Begriffsklärungsseiten. Jonathan Groß (talk) 08:40, 26 June 2017 (UTC)
Okay, sounds good. --Tolanor (talk) 11:20, 26 June 2017 (UTC)
@Pasleim, Pyfisch:? I'm afraid we need someone with a bot here. --Tolanor (talk) 19:53, 4 July 2017 (UTC)
@Jonathan Groß, Tolanor: I started to program it but I encountered problems in distinguishing articles from cross-references. Using the parameter VERWEIS is not an option because many pages don't use it but have the category [[Kategorie:RE:Verweisung]] manually added, for example s:de:RE:Acilius 46. Relying on the category is also not an option because there are also articles in s:de:Kategorie:RE:Verweisung, for example s:de:RE:Augustum 1. Do you know a solution for how I can figure out whether a page is an article or a cross-reference? --Pasleim (talk) 20:13, 22 July 2017 (UTC)
@Pasleim: Good that you bring this up. The template REDaten is apparently not used everywhere with all its parameters, so Kategorie:RE:Verweisung is the best indicator. That full articles such as RE:Augustum 1 and s:de:RE:Augustum 2 also appear in this category is due to our decision to place supplements and corrections not on wiki pages of their own but on the page of the respective entry. By our approach, Augustum 1 and 2 are both: full articles as well as cross-references to other articles. The corresponding Wikidata items should therefore receive two P31 statements, ideally with an applies to part (P518)Pauly-Wissowa vol. II,2 (Q26415062) qualifier, if that can be done by bot.
It is probably better, though, to exclude these cases from the bot run and handle them by hand. One conceivable approach would be to put the pages in Kategorie:RE:Verweisung that are larger than 500 bytes on a maintenance list and then check for each whether or not it is a hybrid of article and cross-reference. Jonathan Groß (talk) 10:57, 25 July 2017 (UTC)
OK, I have now implemented it that way. Here are a few test edits. --Pasleim (talk) 15:36, 26 July 2017 (UTC)
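The 500-byte heuristic from the discussion above can be stated as a plain predicate (a sketch; the category name is real, the function is ours, and the real implementation details are Pasleim's):

```python
def needs_manual_review(page_size_bytes, in_verweisung_category):
    """Flag possible article/cross-reference hybrids for a maintenance list."""
    return in_verweisung_category and page_size_bytes > 500
```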
Request process

In progress. --Pasleim (talk) 15:36, 26 July 2017 (UTC)

First complete run done. I will repeat it weekly. --Pasleim (talk) 19:25, 11 August 2017 (UTC)
This section was archived on a request by: Pasleim (talk) 19:25, 11 August 2017 (UTC)

P18 imports from it.wiki infoboxes for buildings

Request date: 20 June 2017, by: Nemo bis

Link to discussions justifying the request
  • Thanks to [1] I noticed that a lot of items about Italian places lack an image (P18), which makes it hard to find the items which actually need me to shoot or look for a photo.
Task description
  • Take all the templates in w:it:Categoria:Template sinottici - architettura and all their transclusions.
  • Take the filename passed as "Immagine" parameter and add it to the Wikidata item as P18 if there is no P18 yet.
  • Create the item if missing.

--Nemo 20:00, 20 June 2017 (UTC)
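The second step above could be sketched as follows (the regex and function name are illustrative, not how harvesttemplates actually works; the regex is a simplification that does not handle nested templates):

```python
import re

def extract_immagine(wikitext):
    """Pull the 'Immagine' parameter value out of an infobox transclusion."""
    m = re.search(r"\|\s*Immagine\s*=\s*([^|\n}]+)", wikitext)
    return m.group(1).strip() if m else None
```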

Licence of data to import (if relevant)

No database rights in USA, not copyrightable

Discussion

You can do it yourself using https://tools.wmflabs.org/pltools/harvesttemplates/ — doing that for you on all templates now. --Teolemon (talk) 13:58, 21 June 2017 (UTC)

I could but I don't think it's the best way to import tens of thousands of pages. --Nemo 17:41, 24 June 2017 (UTC)
Maybe ✓ Done; some 977 pages done. But it was quite clunky (I couldn't do more than one request at a time to avoid gateway errors), so this may have missed something. --Nemo 21:08, 24 June 2017 (UTC)
I reran it but couldn't find any missing images. --Pasleim (talk) 17:16, 23 August 2017 (UTC)
This section was archived on a request by: --Pasleim (talk) 17:16, 23 August 2017 (UTC)

Importing area_imperial from geobox settlement into area (P2046)

Request date: 27 June 2017, by: ChristianKl

Task description

I recently requested a query on the density of various cities. While doing that, I discovered that many cities don't have area data in Wikidata but do have area data on enwiki. Naperville (Q243007) is a good example. Importing area_land_imperial and area_water_imperial into area (P2046) along with applies to part (P518) would also be great. ChristianKl (talk) 21:59, 27 June 2017 (UTC)

There's also "Infobox settlement" which has area_total_sq_mi, area_land_sq_mi and area_water_km2 which can be imported. ChristianKl (talk) 22:18, 27 June 2017 (UTC)
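One pitfall when importing the imperial parameters is unit conversion: 1 square mile is exactly 2.589988110336 km² (since 1 mile is defined as 1.609344 km). A minimal helper, as a sketch:

```python
SQ_MI_TO_KM2 = 1.609344 ** 2  # exactly 2.589988110336

def sq_mi_to_km2(value_sq_mi):
    """Convert an area_*_sq_mi infobox value to km^2 for area (P2046)."""
    return value_sq_mi * SQ_MI_TO_KM2
```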

Discussion
I notice the "Infobox settlement" will display converted values if the imperial value is present and the SI value is absent, or vice versa. It would seem that from Wikipedia's point of view, it would be better to specify only one of the values and let the other be computed, to avoid the accidental creation of inconsistent values. I don't know how this would affect the import process. Jc3s5h (talk) 07:59, 14 July 2017 (UTC)


Request process

have started the import with HarvestTemplates but now stopped due to complaints on Topic:Tu9am4jyv67ga4tf. --Pasleim (talk) 07:40, 14 July 2017 (UTC)

✓ Done for all settlements in the U.S. Where possible I used data from 2016 U.S. Gazetteer Files (Q32859555) (in km^2). If data was not avaiable from that source I harvested the infobox. --Pasleim (talk) 17:19, 23 August 2017 (UTC)
This section was archived on a request by: --Pasleim (talk) 17:19, 23 August 2017 (UTC)

BBF ID

Request date: 30 June 2017, by: Jonathan Groß

Link to discussions justifying the request
Task description

In May 2017, the Research Library for the History of Education (Q856552) introduced their new database. The old IDs are now obsolete. On my request, their technicians have been kind enough to scrape the new IDs from the old p-strings, and created a matching list of Wikidata Items, old BBF IDs, and new BBF IDs, available here.

I think a bot should replace the old p-strings in these 637 items with the new IDs. Jonathan Groß (talk) 08:41, 30 June 2017 (UTC)
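The new IDs can be validated against the regex of allowed values quoted later in this thread, /[0-9a-f]{8}(-[0-9a-f]{4}){3}-[0-9a-f]{12}/; old p-strings fail it. A sketch (the helper name is ours):

```python
import re

NEW_ID_RE = re.compile(r"^[0-9a-f]{8}(-[0-9a-f]{4}){3}-[0-9a-f]{12}$")

def is_new_bbf_id(value):
    """True if the value matches the updated BBF ID format constraint."""
    return bool(NEW_ID_RE.match(value))
```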

Discussion
Request process

This should not have been done. The old IDs are data, and data is not something which we should discard. How - for example - is someone holding the old ID in their database now supposed to resolve it, using Wikidata? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:09, 3 July 2017 (UTC)

@Pigsonthewing: these are simple IDs. The old BBF IDs became useless after they were replaced by the new ones on the source website. The URLs generated by the property were non-functional with old IDs, and after the property was updated, those old IDs no longer matched the regex format of the allowed values: /[0-9a-f]{8}(-[0-9a-f]{4}){3}-[0-9a-f]{12}/. Cc Jonathan Groß XXN, 10:39, 3 July 2017 (UTC)
I understand all that; I don't agree that the IDs became "useless" as data when the URLs changed. I note that you did not address my question. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:04, 3 July 2017 (UTC)

@Pigsonthewing: The old IDs may be used somewhere (even in printed publications), but they are useless now. It took the BBF technicians a week of work to identify our 637 entries in their old database using the p-strings, since (as they told me) these were never intended to serve as permanent identifiers. The current IDs are stable, though, and replacing the outdated strings with actual identifiers serves everybody best. Jonathan Groß (talk) 11:06, 3 July 2017 (UTC)

No: they are not "useless"; they still identify the same subjects that they did previously. Replacement does not serve the user in my question (which you have also not answered), or someone using printed material such as that to which you refer, at all, much less serve them "best". Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:40, 3 July 2017 (UTC)
How do they effectively identify anything, since they're not used in any accessible database? Jonathan Groß (talk) 17:01, 3 July 2017 (UTC)
The last time I looked, Wikidata was an accessible database; and until this unfortunate bot job, the IDs were used in it. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 20:29, 3 July 2017 (UTC)
@Pigsonthewing: Do you have reason to believe that the old IDs are used elsewhere? ChristianKl (talk) 23:28, 3 July 2017 (UTC)
Even if they are, there is no way to use the old IDs for anything now, not even for identification, since they cannot be checked. The old database was taken down and replaced, so the 630-something old IDs that Wikidata (and Wikipedia) used until June 2017 are obsolete. Their replacement doesn't mean that they are not available (there is still this file). It's just that the old IDs are of no use to anybody anymore. I don't know how widely the BBF archival database was used until the database overhaul in May 2017. I know of a few printed publications and websites that refer to it, in some rare instances using p-strings as IDs, but mostly just with words such as "see BBF / DIPF archival database, Personalblatt XYX (accessed 35 May 2010)". Whatever the case, as long as these publications refer to specific archival matter, it is easily possible to find the new identifiers for said matter in the new database, without recourse to any obsolete URLs or IDs. As far as I am concerned, we can happily part with the obsolete identifiers. Jonathan Groß (talk) 06:47, 4 July 2017 (UTC)
"there is no way to use the old IDs for anything now" Indeed so, whereas if they were kept in Wikidata there would be. That's why this bot job was harmful. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 09:55, 4 July 2017 (UTC)

General comment: AFAIK we do not have an external identifier help page, thus I suggest to draft one in response to this incident which also covers handling of obsolete identifiers and database handles. Situations such as this one have not happened that often until now, but we have to expect more cases in the future. A defined transient process from old to new identifiers would be useful, and this could take into account that old identifiers might be useful for a while after obsolescence, but we probably do not want to pile up plenty of them forever (“BBF ID (old)”, “BBF ID (old2)”, “BBF ID (very old)” and so on…). —MisterSynergy (talk) 10:11, 4 July 2017 (UTC)

The thing is also that having a link with the old identifier is useless since it leads to a 404 page. If the old identifiers are stored, then only as text, without any auto-generated link. Such a solution could also be useful for databases where entries are deleted after, e.g., a person has died (FIDE etc.). One could implement these non-linked IDs either as a new configuration or somehow connect it with the "Deprecated" setting. Steak (talk) 11:48, 4 July 2017 (UTC)
This section was archived on a request by: --Pasleim (talk) 17:15, 23 August 2017 (UTC)

Videogame descriptions

Request date: 18 June 2017, by: PokestarFan

Link to discussions justifying the request
Task description

Use a bot to add a predefined list of descriptions to any item with instance of (P31), and only instance of (P31), video game (Q7889).

List
	'en':'video game',
	'en-ca':'video game',
	'en-gb':'video game',	
	'es':'videojuego',
	'de':'Videospiel', 
	'el':'βιντεοπαιχνίδι',
	'fr':'jeu vidéo',
	'pl':'gra wideo',
	'fi':'videopeli'
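The list above, combined with the only-P31-is-video-game condition from the task description, as a sketch (the function name and dict-based claim model are ours):

```python
DESCRIPTIONS = {
    'en': 'video game', 'en-ca': 'video game', 'en-gb': 'video game',
    'es': 'videojuego', 'de': 'Videospiel', 'el': 'βιντεοπαιχνίδι',
    'fr': 'jeu vidéo', 'pl': 'gra wideo', 'fi': 'videopeli',
}

def descriptions_for(instance_of_values):
    """Return the descriptions only when Q7889 is the item's sole P31 value."""
    return DESCRIPTIONS if instance_of_values == ['Q7889'] else {}
```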
Licence of data to import (if relevant)
Discussion

Are you sure they are correct? MechQuester (talk) 19:41, 20 June 2017 (UTC)

Request process
  • Starting from these, I've added a couple more descriptions including release years, for a part of the items. Some 12000 items (tinyurl.com/yd6ey9nt) don't have a publication date, and I'm in doubt whether to add such simple descriptions (in any case, for some of them API conflicts of non-unique label+description will arise and the bot will fail to make the needed changes); thus maybe it's better to wait until these items have publication date (P577). XXN, 19:29, 21 June 2017 (UTC)
This section was archived on a request by: ChristianKl () 14:59, 4 December 2017 (UTC) | We already banned PokestarFan, who created it.