User talk:MrProperLawAndOrder

Welcome to Wikidata, MrProperLawAndOrder!

Wikidata is a free knowledge base that you can edit! It can be read and edited by humans and machines alike and you can go to any item page now and add to this ever-growing database!

Need some help getting started? Here are some pages you can familiarize yourself with:

  • Introduction – An introduction to the project.
  • Wikidata tours – Interactive tutorials to show you how Wikidata works.
  • Community portal – The portal for community members.
  • User options – including the 'Babel' extension, to set your language preferences.
  • Contents – The main help page for editing and using the site.
  • Project chat – Discussions about the project.
  • Tools – A collection of user-developed tools to allow for easier completion of some tasks.

Please remember to sign your messages on talk pages by typing four tildes (~~~~); this will automatically insert your username and the date.

If you have any questions, don't hesitate to ask on Project chat. If you want to try out editing, you can use the sandbox to try. Once again, welcome, and I hope you quickly feel comfortable here, and become an active editor for Wikidata.

Best regards! From Hill To Shore (talk) 17:21, 4 April 2020 (UTC)[reply]

Wikidata:Administrators' noticeboard‎‎ #‎Wholesale P21. Incnis Mrsi (talk) 06:59, 7 April 2020 (UTC)[reply]

I made four errors, sorry. MrProperLawAndOrder (talk) 20:56, 7 April 2020 (UTC)[reply]
Hi. Can you please explain your basis for claiming Hyacynthus Michel (Q55856596) was male? I've checked all of the listed IDs and none of them mention gender. The person was a professor but there had been female professors since the 1700s, so that isn't a guarantee of gender. Do you have an alternate source? From Hill To Shore (talk) 18:39, 8 April 2020 (UTC)[reply]
(talk page stalker) @From Hill To Shore: bios of female professors look like that. IMHO it is safe to assume that a person whose profession (Beruf) is given with a German masculine noun (such as one ending in -r) is a man. “Hyacynthus” is a false alarm. Incnis Mrsi (talk) 19:55, 8 April 2020 (UTC)[reply]
@Incnis Mrsi: Okay, makes sense. One of the drawbacks of Wikidata is that the interface doesn't ask you for edit summaries by default, so you can't explain the basis for your actions. From Hill To Shore (talk) 21:25, 8 April 2020 (UTC)[reply]

Hi,

About [1] and [2], unfortunately the NIOSH Power Tools Database doesn't differentiate rotary hammer (Q1932875) from hammer drill model (Q23811264).

The way to differentiate them is that rotary hammer (Q1932875) uses SDS (Q458278). They should also be correctly categorized on Commons, so you can check if the images are in commons:Category:Rotary hammers or commons:Category:Hammer drills. The RedBurn (ϕ) 07:53, 1 May 2020 (UTC)[reply]

The RedBurn, when the items about instances of models were created, they all referred to a class with the English label "hammer drill", matching the description, but later someone changed the English label [3]. There is, however, another item labelled "hammer drill", and I have now made the models subclasses of that one, restoring the meaning of the original claim: it matches the description, matches what was stored in instanceOf (namely "hammer drill model"), and matches the NIOSH reference; I checked some of the linked PDFs stored in Commons, and they also say "hammer drill". MrProperLawAndOrder (talk) 14:46, 1 May 2020 (UTC)[reply]

Smartphone model

Hi. What was your basis for changing smartphone model (Q19723451) to product model (Q10929058) in so many smartphone articles? Has there been discussion about this somewhere? Kissa21782 (talk) 07:58, 2 May 2020 (UTC)[reply]

Wikidata:Project chat#Android smartphone model MrProperLawAndOrder (talk) 08:00, 2 May 2020 (UTC)[reply]
Thanks. --Kissa21782 (talk) 18:23, 2 May 2020 (UTC)[reply]
see here Germartin1 (talk) 07:55, 18 June 2020 (UTC)[reply]

Extract GND from VIAF

@Bargioni: asked me to tell you that he acted as follows:

  • saved the first two columns of https://w.wiki/QmY in a file named missing_GND.tsv
  • he launched
perl -lne 'use JSON; chomp; ($q,$v)=split/\t/; $js=`curl -s $v/justlinks.json`; $j=from_json($js); print $q."\t$v\t". join ", ",@{$j->{DNB}}; sleep 1;' missing_GND.tsv > gnd.qs

If you understand it better than me, it might be useful. Good evening, --Epìdosis 17:44, 14 May 2020 (UTC)[reply]

@Bargioni, Epìdosis, Kolja21: - this is great news, guys! Great that it can be done via the command line, with no need to download the whole VIAF DB. I cannot write such Perl stuff myself, but maybe one day I can at least read and understand it. MrProperLawAndOrder (talk) 17:48, 14 May 2020 (UTC)[reply]
@MrProperLawAndOrder: Please note that I modified the query https://w.wiki/QmY a bit to list only ?item and ?viaflink. They match $q and $v in the mini Perl script. Note also that the script output doesn't include P214, which must be present in the second column, nor a reference to VIAF, like S248 TAB Q54919 TAB S813 TAB +2020-05-14T00:00:00Z/11 TAB S214 TAB "[viafid]". Both were added by Epìdosis. HTH. -- Bargioni 🗣 07:08, 15 May 2020 (UTC)[reply]
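
For context, a hedged sketch of what a query in the spirit of https://w.wiki/QmY might look like (the actual saved query may differ): humans with a VIAF ID (P214) but no GND ID (P227), returned together with a full VIAF link so that the one-liner above can fetch justlinks.json for each of them.

#Sketch only; the real https://w.wiki/QmY query may use other filters.
SELECT ?item ?viaflink
WHERE {
 ?item wdt:P31 wd:Q5 ;    # humans
       wdt:P214 ?viaf .   # VIAF ID present
 MINUS { ?item wdt:P227 [] }   # no GND ID yet
 BIND(IRI(CONCAT("https://viaf.org/viaf/", ?viaf)) AS ?viaflink)
}
LIMIT 1000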

Guess there are a few duplicates in that batch. Cheers, Braveheart (talk) 13:08, 19 May 2020 (UTC)[reply]

Braveheart, thank you! Will fix everything immediately and not start new runs before finding the reason. MrProperLawAndOrder (talk) 14:26, 19 May 2020 (UTC)[reply]

https://tools.wmflabs.org/quickstatements/#/batch/34248

Same batch, created immediately after each other. In my local QS list I cannot see a duplicate for these, and the online version doesn't show the statements to me at the moment. Could be a bug in QS. MrProperLawAndOrder (talk) 14:41, 19 May 2020 (UTC)[reply]

That is indeed very odd. Is there a way to list all the cases where an ID of a property is used more than once? Best, Braveheart (talk) 15:28, 19 May 2020 (UTC)[reply]
Braveheart, on the P7902 talk there is the distinct value constraint and I used the data to create a batch to merge. batch SPARQL distinct value violation . Thanks again that you made me aware of the bug. Will try to test when it happens. But first fixing all. Sparql has some delay, so I have to wait for the next check. MrProperLawAndOrder (talk) 15:37, 19 May 2020 (UTC)[reply]
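For reference, a hedged sketch of the kind of distinct-value check mentioned above (not necessarily the exact query behind the constraint report or the merge batch): every Deutsche Biographie (GND) ID (P7902) value that is used on more than one item, with the items concerned.

#Sketch of a distinct-value check for P7902; the constraint report and the
#query used for the merge batch may differ.
SELECT ?value (GROUP_CONCAT(STR(?item); SEPARATOR=", ") AS ?items)
WHERE {
 ?item wdt:P7902 ?value .
}
GROUP BY ?value
HAVING(COUNT(?item) > 1)
ORDER BY ?value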
I had just noticed another example: Wilhelm Modersohn (Q94785554) and Wilhelm Modersohn (Q94785557). The constraint mechanism is (correctly) complaining that these have identical GND and Deutsche Biographie IDs. (And are the GND and Deutsche Biographie IDs really identical?) —Scs (talk) 17:22, 19 May 2020 (UTC)[reply]
Scs, thank you. Yes DtBio and GND are equal. I will use sparql and batch to remove the duplicates. QS seems to be buggy. MrProperLawAndOrder (talk) 17:31, 19 May 2020 (UTC)[reply]
(edit conflict) More examples:
I plucked these out of your recent contributions list; there are lots more. Clearly some part of your importation process is not properly weeding out duplicates. —Scs (talk) 17:35, 19 May 2020 (UTC)[reply]
Scs, well, how do you know that? I have no evidence for what you claim in your last sentence. I have the impression the process itself is creating the duplicates, and I have no control over the QS tool code. MrProperLawAndOrder (talk) 17:40, 19 May 2020 (UTC)[reply]
Yes, this has happened to other users recently as well. I suspect some flaw in the QS tool, but at this point it is not completely clear what is going on here. —MisterSynergy (talk) 17:44, 19 May 2020 (UTC)[reply]
MisterSynergy, thank you for the confirmation. I had suspicious QS edits last week, outside item creation, when inserting external-ID property claims, but I was not sure whether I had made a mistake. Regarding my current edits: I will take great care to merge all duplicates that I create. @Kolja21: sorry, the "human" section of Property talk:P227/Duplicates will be a bit full from time to time. I will try my best to minimize the impact on the usability of that page. MrProperLawAndOrder (talk) 17:57, 19 May 2020 (UTC)[reply]
@MrProperLawAndOrder: I'm not sure what's going on, either. My claim about "some part of your importation process" was a broad one, potentially encompassing QuickStatements itself. I did wonder if, perhaps, your input file contained duplicate rows which you were expecting QuickStatements to detect and weed out. Are you saying that you've confirmed that your input does not contain these duplicates? If so, then as MisterSynergy suggests, it's sounding like there's something wrong with QuickStatements itself, which is surprising, but I guess not impossible. —Scs (talk) 18:37, 19 May 2020 (UTC)[reply]

@Scs: I can now confirm it: there is only one line for Ströber in my CSV

02:28, 22 May 2020 diff hist  +1,927‎  N Konrad Ströber (Q95005332) ‎ ‎Created a new Item: batch #34502 current Tag: QuickStatements [1.5]
02:28, 22 May 2020 diff hist  +1,927‎  N Konrad Ströber (Q95005331) ‎ ‎Created a new Item: batch #34502 current Tag: QuickStatements [1.5]
02:28, 22 May 2020 diff hist  +1,927‎  N Konrad Ströber (Q95005330) ‎ ‎Created a new Item: batch #34502 current Tag: QuickStatements [1.5]

and only one batch was running. There are several such cases of triplicate items. MrProperLawAndOrder (talk) 02:36, 22 May 2020 (UTC)[reply]

Hello @MrProperLawAndOrder:

from my point of view there are some problems with the objects newly created during the last few days.

1) A problem with QuickStatements (?!?) seems to create duplicates, in some cases even three or four objects for the same GND. They have to be merged afterwards based on the shared GND. It would be better to avoid creating duplicates in the first place.

2) Objects which already existed before are not taken into account, for example:

They have to be merged manually (based on the GND, when trying to harvest the GND from the German-language Wikipedia, which fails with a "unique constraint violation"), since they did not have a common ID before.

3) In the past, the bot (User:Pi bot) operated by @Mike Peel: has been able to assign newly created articles to existing Wikidata objects automatically. I am not sure whether this will still be possible with the newly created objects; I am afraid this will have to be done manually in the future, e.g. due to the lack of birth/death years in the newly created objects.

From my point of view, as long as these problems have not been discussed and solved, the mass creation of GND objects should be stopped. Thanks a lot! --M2k~dewiki (talk) 14:55, 23 May 2020 (UTC)[reply]

@M2k~dewiki: I told you on your talk page that I merge the GND/DtBio duplicates via batch; other clean-up I can also do, and have done before. See Property talk:P227/Duplicates: I brought it down very fast. MrProperLawAndOrder (talk) 15:23, 23 May 2020 (UTC)[reply]
Hello @MrProperLawAndOrder: from my point of view, merging duplicates afterwards is only a workaround, not an actual solution to the initial problem (e.g. in QuickStatements?). It would be better to avoid creating duplicates in the first place and to solve the root-cause problem first. In addition, this does not address 2) and 3). For example, would it be possible to add further properties like year of birth/death (country of citizenship, occupation, ...) to these objects when creating them? I think this might help Pi bot to automatically assign newly created articles to these GND objects. --M2k~dewiki (talk) 16:34, 23 May 2020 (UTC)[reply]
@M2k~dewiki: I cannot fix QS; if someone fixes it, great, but the only real problem is wasted Q-ids. GND is CC0, so one can import data from there, including year of birth/death. As soon as someone creates a new item and adds the GND ID, they will get a warning if it already exists and can merge. I will write more at Property talk:P227#Create GND humans from Deutsche Biographie. MrProperLawAndOrder (talk) 18:10, 23 May 2020 (UTC)[reply]
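As an aside, a hedged sketch of a query that would surface the gap mentioned above, namely newly created GND humans still lacking a date of birth; the Q-id cutoff of 94000000 is only an assumed stand-in for "recently created".

#Sketch: recently created GND humans without date of birth (P569).
#The numeric cutoff is an assumption, not an exact boundary.
SELECT ?item ?gnd
WHERE {
 ?item wdt:P31 wd:Q5 ;
       wdt:P227 ?gnd .
 MINUS { ?item wdt:P569 [] }
 FILTER(xsd:integer(STRAFTER(STR(?item), "/Q")) > 94000000)
}
LIMIT 200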
Regarding Quickstatements, also see https://phabricator.wikimedia.org/T234162 --M2k~dewiki (talk) 17:40, 24 May 2020 (UTC)[reply]

Please respect the maxlag=5 policy

Your bot is creating MANY MANY MANY items, while all other scripts are put on hold. This only increases the lag on the replication servers further. Since yesterday I have seen a HUGE increase in newly created items, all created by your script. Please revert to the default setting of maxlag=5 in your user-config.py and restart your script(s) in order to make this effective. Right now your script is increasing the lag, which means that all other scripts are halted for most of the day. Edoderoo (talk) 08:13, 20 May 2020 (UTC)[reply]

@Edoderoo: can you add references to your claims? Where is my bot? Where can one see "HUGE increase of newly created items, all created by your script"? No one else created new items in that time? All other scripts on hold? user-config.py? MrProperLawAndOrder (talk) 11:32, 20 May 2020 (UTC)[reply]
The number of items created at least doubled in the last 24 hours; usually 99% of [items] are created by User:LargeDatasetBot and User:Ghuron. It turned out that you started too many parallel QuickStatements runs, which caused the flood of new items, and QS does not have any mechanism to prevent flooding :-( If you could at least not start so many batches in parallel, it would be appreciated.
About stats: I collect some data from my script here, which shows a sudden increase in processing, due to many new items, around 15:00.
By the way, on the Telegram group people were complaining that your items are missing a lot of claims that are available in the sources. Are you planning to add those (VIAF, birth dates, etc.)? This would improve the quality of the items and make them more meaningful for searches. Others were saying that you were also creating duplicates, but I don't know of any example. Edoderoo (talk) 12:12, 20 May 2020 (UTC)[reply]
@Edoderoo: thank you! Re duplicates: see the thread above; it is a QS bug, and all of them were merged via another QS batch. Re enriching: Property talk:P227#Create GND humans from Deutsche Biographie; VIAF for each item, date of birth/death depending on source quality. DtBio is a huge aggregator of other GND human content, so the items created will be useful for interlinking information. Re QS runs: I couldn't import more than ~5000 statements into one batch, so I had to go through "New batch" several times, copy-paste subsets of my statement list, press import, and press "run in background", because only then is a batch actually created and given an ID. I think I couldn't press stop at that time, which can only be done when a batch is actually running and not initializing. I will try to make sure to press stop next time. And I should look into how to run a proper bot; then there should be no problems at all! MrProperLawAndOrder (talk) 13:17, 20 May 2020 (UTC)[reply]
A proper bot will require some programming skills ;-) If Wikidata got a large amount of money to scale up its hardware, that would make it easier for all of us ;-) For now we should just be thankful to WMDE and/or WMF that we have what we have. I know people work hard behind the scenes to make things better. Edoderoo (talk) 13:26, 20 May 2020 (UTC)[reply]

Edoderoo, this time I had only one batch running [4] and got 90 new items per minute, which was the speed last time too. So even with one batch it is 90. Was the problem 90/min, or six batches running at a time? Is the hardware really that limited? If there were six people like me, each with a sixth of what I have now, the problem would be the same, wouldn't it? Or was there another problem, namely that you had a bot running after my edits, adding Lnl (Dutch labels)? This time I have included Lnl. The 90 per minute included the duplicates; if QS didn't create them, I would have fewer edits in total for creation and none for merging. The duplicates also increase hardware usage. MrProperLawAndOrder (talk) 02:54, 22 May 2020 (UTC)[reply]

There are several answers to this. Tools like QS and PetScan can indeed create issues when they are used with large batches. If only one user is doing that, the hardware can handle it. But in the last weeks one bot was continuously importing scientific papers and scientists, while another was importing galaxy systems. They had already pushed the system against its limit, and then your large batch pushed it further still. And because QS does not care about any limit, it will only push harder. That is actually something that should be solved in QS; you can't do much more than know that those limits exist and avoid running too many parallel batches. I know that it feels stupid; I have the same thing going on. For 5 years now I have been running my scripts to add descriptions in Dutch, and I always have 10x more work ahead of me than what I have done, and there is no way to catch up. When I add 1M descriptions, other users add 2M scientific papers. The good news is that we will have a hobby for sure for the coming 20 years ;-) Edoderoo (talk) 04:27, 22 May 2020 (UTC)[reply]

GND DtBio human creations that already had an item about the human without GND

changed headline from "Additions" to reflect the issue raised MrProperLawAndOrder (talk) 15:16, 27 May 2020 (UTC)[reply]

Hi there,

many thanks for your various additions. But please make sure not to create duplicates, like Q33104687/Q95239556 =).

Best, Nomen ad hoc (talk) 12:10, 27 May 2020 (UTC).[reply]

@Nomen ad hoc: thank you for spotting this duplicate among the ~175000 newly created items. The creations all have a GND ID (P227) and a DtBio ID (P7902); details, also about their importance, are at Property talk:P227#Create GND humans from Deutsche Biographie. Enriching with various details is planned; VIAF ID and time information are high on the agenda, and especially the VIAF ID will help in checking for more duplicates via SPARQL. I will take care of the checking [I will write a query immediately after saving this message and share it at the P227 talk page]. I checked all human items for the existence of GND and DtBio. If you know a way of doing what you ask of me that requires less work than merging afterwards, please share it. Before your merge the item didn't have any identifier from VIAF; after the merge it had GND, and later you added VIAF. I don't know how one could have known about it via SPARQL beforehand. MrProperLawAndOrder (talk) 15:14, 27 May 2020 (UTC)[reply]
@Nomen ad hoc: without your post here, I wouldn't have done it now. [5] ... Several issues already; the recently created items mostly have no VIAF in WD yet. A lot of work ahead. MrProperLawAndOrder (talk) 16:25, 27 May 2020 (UTC)[reply]
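
A hedged sketch of the VIAF-based duplicate check referred to above (the query actually posted on the P227 talk page may differ): pairs of human items sharing the same VIAF ID, i.e. merge candidates or cases for the VIAF error pages.

#Sketch only: human items sharing one VIAF ID (P214).
SELECT ?item1 ?item2 ?viaf
WHERE {
 ?item1 wdt:P31 wd:Q5 ;
        wdt:P214 ?viaf .
 ?item2 wdt:P31 wd:Q5 ;
        wdt:P214 ?viaf .
 FILTER(STR(?item1) < STR(?item2))
}
LIMIT 500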

Worldcat Identity vs VIAF

Hi! About your reverts like this one: https://www.wikidata.org/w/index.php?title=Q1605162&oldid=prev&diff=1183918664. I agree with removing the Worldcat Identity, but why did you keep the VIAF id? Worldcat is derived from the VIAF record, so it's very possible someone will add it again on a future refresh.

And please add these errors to https://en.wikipedia.org/wiki/Wikipedia:VIAF/errors#WorldCat_Identities_errors so that OCLC can fix them some day. Thanks --Vladimir Alexiev (talk) 08:15, 29 May 2020 (UTC)[reply]

Vladimir Alexiev, I didn't keep it, but removed it in the next edit [6]. Re "Worldcat is derived from the VIAF record": but in that case the WorldCat link was LCCN-based; do you know why?
@MrProperLawAndOrder: The Worldcat page is made by the curators of VIAF (OCLC) and those links come in VIAF dumps, no matter whether they use VIAF id or LCCN id as a basis.
Vladimir Alexiev, that doesn't answer the question, which asked why it is LCCN-based.
Re reporting: in this case I see no evidence that VIAF made an error. The VIAF ID came from dewiki. Also, as mentioned in the edit summary of each of the two removal edits that I made, the type was wrong; would OCLC link IDs across wrong types, is there a chance of that kind of error in OCLC? MrProperLawAndOrder (talk) 13:27, 29 May 2020 (UTC)[reply]
Yep. They have one record for a person and his atelier/workshop/architectural studio. You say there should be two. So I don't understand what you mean by "no evidence VIAF made an error" --Vladimir Alexiev (talk) 08:56, 2 June 2020 (UTC)[reply]
Do they have one for both, as you claim? My edit summary for each removal was "wrong type - id for corporate, but type is human"; the IDs were for the corporate body when I checked, and I removed the ones for the corporate body. Re "I don't understand what you mean by 'no evidence VIAF made an error'": I just see none, and you didn't show any. MrProperLawAndOrder (talk) 09:40, 2 June 2020 (UTC)[reply]

Quality of data imports

If someone complains about the quality of data imports, he should take a look at the 1.6 million ORCID IDs. I've just clicked on "C Frank" (25. Jan. 2020, LargeDatasetBot) and got the result "Frank Schreiber". Most of these researchers are irrelevant for Wikipedia, and in the few cases where they are relevant, the imports are often duplicates. --Kolja21 (talk) 06:40, 30 May 2020 (UTC)[reply]

Kolja21, interesting example. My first edit on an item created as an ORCID item was yesterday, when I restored the original meaning [7]. Someone had converted an ORCID person into a DtBio person, and I found the item because sex was missing. Re "irrelevant for Wikipedia": did you mean WP or WD? I am fine with them being in WD; it will help to show the relevance and quality of ORCID. I read that VIAF removed ORCID as a source. My focus currently is on VIAF-related data, and within that, GND humans. MrProperLawAndOrder (talk) 10:37, 30 May 2020 (UTC)[reply]
Irrelevant for Wikipedia. For Wikidata, ORCID imports are useful but exhausting to enrich. BTW, if you focus on VIAF-related data: do you know how many VIAF IDs contain GND IDs that have not been imported to WD yet? --Kolja21 (talk) 16:17, 30 May 2020 (UTC)[reply]
Kolja21, re ORCID: yes. Re VIAF/GND: no. I thought: step 1, add VIAF for all GND humans; this process has started. Step 2, resolve VIAF duplicates in WD on humans where one of the duplicates has a GND; I posted four queries for this on the P227 talk page. ... The next VIAF dump should appear next week; one could download it, import it into a DB, download WD's (human) VIAF IDs that have no GND, import them into the same DB, and then use SQL to find the GNDs. ... Another idea: download the GND dump, which also contains VIAF IDs. But that is probably best done in June/July, when the next dump appears. ... One could then also look at the relationships in the dump and add relatives to GND humans in WD. Ideally that would be done on WMF servers. WD needs better software/hardware to process imports, and better tools for visualization; SQID and Reasonator are inspirations, but have several problems. Projects that only visualize some WD data take money away from general-purpose visualization projects, e.g. meta:Grants:Project/Maximilianklein/WHO, and I have seen no evidence that this is to the benefit of WD. It's all just items and properties; why finance a tool that only works to show data that is of interest to some? And the proposers are proud to have written other tools before, for similar purposes, that don't work anymore, or not well enough, or that overlap. Now looking again at meta:Grants:Project/Maximilianklein/WHO#Project_goals ... "Which women are represented in French Wikipedia but not English Wikipedia?" ... did they ever use SPARQL? MrProperLawAndOrder (talk) 17:06, 30 May 2020 (UTC)[reply]
Example Elmar Roots (Q12361963): Thierry made the right guess, although the year of birth is different in DtBio (1900 vs. 1906).[8] The item had no GND even though it was linked to VIAF 69817259, and what is more, GND 122701941 contains a link to Wikipedia. --Kolja21 (talk) 16:40, 30 May 2020 (UTC)[reply]
Kolja21, interesting, will think about it. MrProperLawAndOrder (talk) 17:06, 30 May 2020 (UTC)[reply]
@Kolja21: I perfectly agree about the ORCID imports: useful but really exhausting to enrich. BTW, I want to point you (probably you already know them) to two useful pages: Wikidata:WikiProject Authority control/VIAF errors to report VIAF errors, and Wikidata:WikiProject Authority control/VIAF linking to multiple items to check cases where VIAF links to two items (they either need merging or reporting in VIAF errors). Good night, --Epìdosis 23:26, 30 May 2020 (UTC)[reply]

Company, male, born 1972 [9] MrProperLawAndOrder (talk) 03:53, 31 May 2020 (UTC)[reply]

Ongoing import from GND

Hi! Today I've started importing date of birth (P569)/date of death (P570), VIAF ID (P214) and ISNI (P213) from GND (thanks to @Bargioni:, we are dividing the batches between the two of us). Please keep an eye on the duplicates which will gradually come out thanks to VIAF ;-) Bye, --Epìdosis 15:50, 1 June 2020 (UTC)[reply]

Check a merge

Hi MrProperLawAndOrder, could you please check this merge? Same name and year of birth, but a different place of birth, and per GND 134480112 it is a musician. Thank you. Raymond (talk) 17:48, 3 June 2020 (UTC)[reply]

Talk:Q4707481#GND MrProperLawAndOrder (talk) 23:07, 3 June 2020 (UTC)[reply]

Same for this merge. Different persons, compare GND 120483467 with GND 134211960. Thanks. Raymond (talk) 18:07, 3 June 2020 (UTC)[reply]

Raymond, undone. How did you find these? I had two batches merging on the same English label, VIAF and year of birth, where the newer item had an ID above Q90000000 and a DtBio ID. If it were only two errors, that would be great, but I fear there are more. Another 600 candidates: https://w.wiki/Sns MrProperLawAndOrder (talk) 23:44, 3 June 2020 (UTC)[reply]
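
For reference, a hedged reconstruction of the merge-candidate criteria described above (the saved query at https://w.wiki/Sns may differ in detail): same English label, same VIAF ID, same year of birth, and a newer item above Q90000000 that carries a DtBio ID.

#Sketch of the merge-candidate criteria; not necessarily the query
#behind https://w.wiki/Sns.
SELECT ?old ?new ?viaf
WHERE {
 ?new wdt:P214 ?viaf ;
      wdt:P7902 [] ;
      wdt:P569 ?dobNew ;
      rdfs:label ?label .
 FILTER(LANG(?label) = "en")
 FILTER(xsd:integer(STRAFTER(STR(?new), "/Q")) > 90000000)
 ?old wdt:P214 ?viaf ;
      wdt:P569 ?dobOld ;
      rdfs:label ?label .
 FILTER(?old != ?new)
 FILTER(YEAR(?dobOld) = YEAR(?dobNew))
}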

Hi MrProperLawAndOrder, thank you for your quick revert. In the German Wikipedia we have a maintenance category, de:Kategorie:Wikipedia:GND in Wikipedia weicht von GND in Wikidata ab, which shows discrepancies between the local GNDs and the GNDs on Wikidata. Most discrepancies result from real duplicate GNDs. We report these duplicates to the DNB (de:Wikipedia:GND/Fehlermeldung/Juni 2020), and after the DNB has merged them we carry the "winning" GND over to Wikipedia and Wikidata. I check new entries in the maintenance category more or less every day. This is how I found the wrong merges yesterday. Raymond (talk) 06:32, 4 June 2020 (UTC)[reply]

Raymond, thank you. That means the target of the merge had the GND which is in Wikipedia, and the source had another one. I will exclude such cases from the query; even in the helpful scenario of finding a real duplicate it isn't needed, since that can be checked through SPARQL or Listeria lists, and worse, if there is no dewiki article it will not be found so easily. There were just so many that I decided to do a batch merge. The 600 from above was maybe rounded, but the number has now increased to 792 new candidates. It is increasing because Epìdosis/Bargioni are adding VIAF and birth dates to items that have DtBio. MrProperLawAndOrder (talk) 07:51, 4 June 2020 (UTC)[reply]

GND human - merge candidates

Could you have a look at these?

Thanks, --Epìdosis 22:46, 9 June 2020 (UTC)[reply]

Just to be more precise: the previous 10 pairs were just a sample of the 184 results of the following query, which I've just managed to construct:

#Same GND in two items, deprecated rank in the first, normal rank in the second
SELECT ?item1 ?item2 ?gnd ?msLabel
WHERE {
 { SELECT ?gnd
 WHERE {
 ?item p:P227 ?stdep .
 ?stdep wikibase:rank wikibase:DeprecatedRank .
 ?stdep ps:P227 ?gnd .
 } }
 ?item1 p:P227 ?stdep .
 ?stdep wikibase:rank wikibase:DeprecatedRank .
 ?stdep ps:P227 ?gnd .
 OPTIONAL { ?stdep pq:P2241 ?ms . }
 ?item2 wdt:P227 ?gnd .
 SERVICE wikibase:label { bd:serviceParam wikibase:language "de,en". }
}
ORDER BY ?msLabel

Good night, --Epìdosis 22:57, 9 June 2020 (UTC)[reply]

User:Epìdosis thank you for the explanation; the first is already merged. It seems GND merged two records, and now this is done in WD as well: perfect. Some in your sample have the same Q-id for left and right. You can see the ones that are already merged if you have redirect colouring; mine are green, I didn't like the orange. MrProperLawAndOrder (talk) 23:01, 9 June 2020 (UTC)[reply]

About the placement of ISNI

I personally think that opening a discussion about ISNI (P213) being placed in the ISO section before VIAF ID (P214) doesn't require ISIN (P946) to already be in the ISO section. As all ISO IDs except ISIN are in the aforementioned section, and ISIN will soon be there, I frankly see no impediment to opening such a discussion about ISNI. Bye, --Epìdosis 11:16, 10 June 2020 (UTC)[reply]

User:Epìdosis it's just easier; I didn't say "require". I would like to have as much as possible done before bothering a wider audience with the ISO ID section. Some people lack imagination and ask things that would never have come to my mind. So, make everything smooth and easy *before* starting a discussion with an unknown audience. Just life experience. MrProperLawAndOrder (talk) 14:23, 10 June 2020 (UTC)[reply]

Implementing ISNI 16-digit no space

The ISNI format has been changed. Before it was with spaces; now it is without. But it is impossible to save values without spaces: a message about the detection of a harmful ISNI always appears and does not allow saving without spaces. --Christian140 (talk) 06:53, 12 June 2020 (UTC)[reply]

@Christian140: Don't worry, the change has been reverted until there is clear consensus for it and until, if so, the abuse filter has been edited in order to allow inserting values without spaces. --Epìdosis 12:19, 12 June 2020 (UTC)[reply]
@Christian140, Epìdosis: I acted following the consensus. I have already asked Epìdosis to adjust the abuse filter. Transparent documentation, created by me, can be found at Property talk:P213#Implementing ISNI 16-digit no space. MrProperLawAndOrder (talk) 17:05, 12 June 2020 (UTC)[reply]
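
For anyone following the migration, a hedged sketch (an assumption, not a query taken from the discussion) that lists ISNI (P213) values still stored in the old spaced format:

#Sketch: ISNI values that still contain spaces, i.e. the old grouped format.
SELECT ?item ?isni
WHERE {
 ?item wdt:P213 ?isni .
 FILTER(CONTAINS(?isni, " "))
}
LIMIT 100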

Moved to Talk:Q96107618, will add more. MrProperLawAndOrder (talk) 19:29, 13 June 2020 (UTC)[reply]

June 2020

You have been blocked temporarily for abuse of editing privileges. Once this block has expired, you are welcome to make useful contributions. If you believe this block is unjustified, you may contest it by editing this page and adding the following template with a suitable reason: {{unblock|1=<the reason for your unblock request>}}. If you are logged in, and the option has not been disabled, you may also email the blocking administrator (or any administrator from this list) by using this form. See Wikidata:Guide to appealing blocks for more information.

I blocked you for 3 days for personal attacks at WD:AN. --Ymblanter (talk) 19:11, 17 June 2020 (UTC)[reply]
Ymblanter, "19:10, 17 June 2020 Ymblanter talk contribs blocked MrProperLawAndOrder talk contribs with an expiration time of 3 days (account creation disabled) (Intimidating behaviour/harassment: personal attacks)" - could you please provide a diff (or diffs) and specify which part or parts of my text there were classified as "personal attacks"? I really would like to avoid doing anything that would constitute a personal attack per WD:NPA. The "LawAndOrder" is part of my mission; I want to enforce it and I want to comply. MrProperLawAndOrder (talk) 01:55, 18 June 2020 (UTC)[reply]
I am sorry but you are not in a position to enforce anything here yet. Concerning complying, the whole topic on AN is full of personal attacks, which I pointed out above.--Ymblanter (talk) 05:44, 18 June 2020 (UTC)[reply]

I have upped your block to indefinite because using CheckUser, I find you Likely related to User:Tamawashi, a globally banned user. You will have to appeal the global ban successfully before you can even be considered for unblock here.--Jasper Deng (talk) 19:37, 19 June 2020 (UTC)[reply]

Hi

What is the point of creating Bahman Farmanara (Q95861178) when Bahman Farmanara (Q2749785) exists? Shkuru Afshar (talk) 10:41, 18 July 2022 (UTC)[reply]