Wikidata:Request a query/Archive/2024/01

This page is an archive. Please do not modify it. Use the current page, even to continue an old discussion.

Query for what Wikimedia projects are NOT on a given item (technically impossible?)

Rather advanced query being asked for here, and one which, from my mediocre understanding of querying, might not currently be possible, at least not "elegantly". This is basically an attempt, when interlinking Wikimedia projects, to see what still hasn't been connected, or what is truly missing.

For instance, Water has links to 259 Wikipedia projects, whereas there are currently 326 active Wikipedias, so this query should tell me which are the 67 remaining Wikipedia projects (plus maybe Wikimedia projects in general) that aren't listed.

A pie-in-the-sky query would be one that could return a list of items that don't have all of the however-many Wikimedia projects and list which ones aren't present, though maybe that's better served by a bot?

I suppose the ultimate intention of this is to prove the coverage of Q5460604 or such. Akaibu1 (talk) 01:26, 1 January 2024 (UTC)

@Akaibu1: This WD query starts from the expectation there are 347 language wikipedias ... there may be an additional triple it should be looking at to eliminate some closed wikipedias. Possibly we can get around to finding that. But the query is fairly simple: list all language wikipedias; eliminate from the list those that have a sitelink from a specified item.
SELECT ?item ?itemLabel ?url
WHERE 
{
  hint:Query hint:optimizer "None".
  ?item wdt:P31 wd:Q10876391.
  ?item wdt:P856 ?url .  
  
  minus { ?article schema:about wd:Q283;
                   schema:isPartOf ?url .
        }

  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
}
Try it!
--Tagishsimon (talk) 08:26, 1 January 2024 (UTC)
@Tagishsimon hello, thank you very much for the query! I was able to tweak your provided query to filter out said closed wikis, but I'm sure there's a more efficient way of doing this.
SELECT ?item ?itemLabel ?url
WHERE 
{
  hint:Query hint:optimizer "None".
  ?item wdt:P31 wd:Q10876391.
  ?item wdt:P856 ?url .

  FILTER (
    STR(?url) NOT IN (
      "https://mo.wikipedia.org/",
      "https://ak.wikipedia.org/",
      "https://cho.wikipedia.org/",
      "https://aa.wikipedia.org/",
      "https://ho.wikipedia.org/",
      "https://kr.wikipedia.org/",
      "https://kj.wikipedia.org/",
      "https://mh.wikipedia.org/",
      "https://mus.wikipedia.org/",
      "https://na.wikipedia.org/",
      "https://ng.wikipedia.org/",
      "https://lrc.wikipedia.org/",
      "https://ii.wikipedia.org/",
      "https://tokipona.wikipedia.org/",
      "https://ru-sib.wikipedia.org/",
      "https://tlh.wikipedia.org/"
      )
  )

  MINUS { 
    ?article schema:about wd:Q283;
             schema:isPartOf ?url .
  }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
}
Try it!
Akaibu1 (talk) 17:58, 10 January 2024 (UTC)
@Akaibu1: Good work; yes - probably look for a dissolved, abolished or demolished date (P576) statement in the item - I think that gets 15 out of your 16.
SELECT ?item ?itemLabel ?url
WHERE 
{
  hint:Query hint:optimizer "None".
  ?item wdt:P31 wd:Q10876391.
  ?item wdt:P856 ?url . 
  ?item wdt:P576 []. 
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
}
Try it!
SELECT ?item ?itemLabel ?url
WHERE 
{
  hint:Query hint:optimizer "None".
  ?item wdt:P31 wd:Q10876391.
  ?item wdt:P856 ?url .  
  filter not exists {?item wdt:P576 []. }
  minus { ?article schema:about wd:Q283;
                   schema:isPartOf ?url .
        }

  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
}
Try it!
--Tagishsimon (talk) 18:20, 10 January 2024 (UTC)
@Tagishsimon Looking into it, it seems to be just a matter of adding the property to Q2996321, though that item has a similar property in discontinued date (P2669); not sure whether that should be replaced outright with P576 or left as it is. Akaibu1 (talk) 18:50, 10 January 2024 (UTC)
@Akaibu1: I'd be inclined to add a P576; or else add a filter not exists {?item wdt:P2669 []. } to the query. --Tagishsimon (talk) 19:06, 10 January 2024 (UTC)
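For reference, a sketch (untested) combining both exclusions on top of the query above - wikis flagged by either dissolved, abolished or demolished date (P576) or discontinued date (P2669) are dropped:
SELECT ?item ?itemLabel ?url
WHERE 
{
  hint:Query hint:optimizer "None".
  ?item wdt:P31 wd:Q10876391.                  # language edition of Wikipedia
  ?item wdt:P856 ?url .                        # official website
  FILTER NOT EXISTS { ?item wdt:P576 [] . }    # exclude wikis with a dissolved/abolished date
  FILTER NOT EXISTS { ?item wdt:P2669 [] . }   # exclude wikis with a discontinued date
  MINUS { ?article schema:about wd:Q283;
                   schema:isPartOf ?url .
        }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
}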

List of articles in Wikipedia which have a specific identifier

I want to search for the languages which have an ISO 639-3 code and whose articles exist in ja.wikipedia. If possible, I would also like another version which uses w:en:List of ISO 639-3 codes or w:en:Category:Redirects from ISO 639 in case some codes haven't been registered. --FlatLanguage (talk) 15:51, 2 January 2024 (UTC)

@FlatLanguage: 3 reports - 1. ja.wiki article exists, 2. all languages, showing the ja.wiki article if it exists, 3. no ja.wiki article. Not sure that reports based on the EN wiki list or category can be done.
SELECT ?item ?itemLabel ?iso ?article ?sitelink
WHERE 
{
  hint:Query hint:optimizer "None".
  ?item wdt:P220 ?iso.
  ?article schema:about ?item ;
  schema:isPartOf <https://ja.wikipedia.org/> ; 
  schema:name ?sitelink .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". } 
  
}
Try it!
SELECT ?item ?itemLabel ?iso ?article ?sitelink
WHERE 
{
  ?item wdt:P220 ?iso.
  OPTIONAL {?article schema:about ?item ;
  schema:isPartOf <https://ja.wikipedia.org/> ; 
  schema:name ?sitelink .}
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". } 
  
}
Try it!
SELECT ?item ?itemLabel ?iso ?article ?sitelink
WHERE 
{
  ?item wdt:P220 ?iso.
  FILTER NOT EXISTS {?article schema:about ?item ;
  schema:isPartOf <https://ja.wikipedia.org/> ; 
  schema:name ?sitelink .}
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". } 
  
}
Try it!
--Tagishsimon (talk) 18:12, 2 January 2024 (UTC)
Thank you! These are what I want. --FlatLanguage (talk) 23:32, 2 January 2024 (UTC)

Toggle color if data

I have made this query, which shows familial connections based on a given person:

#defaultView:Graph
PREFIX gas: <http://www.bigdata.com/rdf/gas#>

SELECT ?item ?itemLabel ?linkTo
WHERE {
  {
    SERVICE gas:service {
      gas:program gas:gasClass "com.bigdata.rdf.graph.analytics.SSSP" ;
                  gas:in wd:Q111424233 ;
                  gas:traversalDirection "Forward" ;
                  gas:out ?item ;
                  gas:out1 ?depth ;
                  gas:maxIterations 6 ;
                  gas:linkType wdt:P40
    }
    OPTIONAL { ?item wdt:P40 ?linkTo }
  } UNION {
    SERVICE gas:service {
      gas:program gas:gasClass "com.bigdata.rdf.graph.analytics.SSSP" ;
                  gas:in wd:Q111424233 ;
                  gas:traversalDirection "Forward" ;
                  gas:out ?item ;
                  gas:out1 ?depth ;                  
                  gas:maxIterations 6 ;
                  gas:linkType wdt:P26.
    }
    OPTIONAL { ?item wdt:P26 ?linkTo }
     }
  SERVICE wikibase:label {bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en" }
}
Try it!

Now I want to show whether or not persons have a value set for WikiTree person ID (P2949) by giving them different colors (like the co-author graph, which shows different colors for male and female scientists). How can I do that? Cavernia (talk) 21:17, 3 January 2024 (UTC)

@Cavernia: I think hex values stuffed into ?rgb is the way to go:
#defaultView:Graph
PREFIX gas: <http://www.bigdata.com/rdf/gas#>

SELECT ?item ?itemLabel ?linkTo ?rgb 
WHERE {
  {
    SERVICE gas:service {
      gas:program gas:gasClass "com.bigdata.rdf.graph.analytics.SSSP" ;
                  gas:in wd:Q111424233 ;
                  gas:traversalDirection "Forward" ;
                  gas:out ?item ;
                  gas:out1 ?depth ;
                  gas:maxIterations 6 ;
                  gas:linkType wdt:P40
    }
    OPTIONAL { ?item wdt:P40 ?linkTo }
  } UNION {
    SERVICE gas:service {
      gas:program gas:gasClass "com.bigdata.rdf.graph.analytics.SSSP" ;
                  gas:in wd:Q111424233 ;
                  gas:traversalDirection "Forward" ;
                  gas:out ?item ;
                  gas:out1 ?depth ;                  
                  gas:maxIterations 6 ;
                  gas:linkType wdt:P26.
    }
    OPTIONAL { ?item wdt:P26 ?linkTo }
     }
  OPTIONAL {?item wdt:P2949 ?WTPID. BIND("88ffff" as ?rgb)}
  SERVICE wikibase:label {bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en" }
}
Try it!
--Tagishsimon (talk) 00:30, 6 January 2024 (UTC)
Thank you. Yes, using the hex values was my idea as well, but I didn't know how to formulate the difference between WikiTree ID set or not set. However, the query doesn't seem to work properly. --Cavernia (talk) 10:53, 6 January 2024 (UTC)
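If the problem is that ?rgb stays unbound for people without a WikiTree ID, one possible workaround - a sketch, untested, with arbitrary hex values - is to always bind a colour with IF/EXISTS instead of the OPTIONAL:
#defaultView:Graph
PREFIX gas: <http://www.bigdata.com/rdf/gas#>

SELECT ?item ?itemLabel ?linkTo ?rgb
WHERE {
  {
    SERVICE gas:service {
      gas:program gas:gasClass "com.bigdata.rdf.graph.analytics.SSSP" ;
                  gas:in wd:Q111424233 ;
                  gas:traversalDirection "Forward" ;
                  gas:out ?item ;
                  gas:out1 ?depth ;
                  gas:maxIterations 6 ;
                  gas:linkType wdt:P40
    }
    OPTIONAL { ?item wdt:P40 ?linkTo }
  } UNION {
    SERVICE gas:service {
      gas:program gas:gasClass "com.bigdata.rdf.graph.analytics.SSSP" ;
                  gas:in wd:Q111424233 ;
                  gas:traversalDirection "Forward" ;
                  gas:out ?item ;
                  gas:out1 ?depth ;
                  gas:maxIterations 6 ;
                  gas:linkType wdt:P26.
    }
    OPTIONAL { ?item wdt:P26 ?linkTo }
  }
  # one colour if a WikiTree person ID (P2949) exists, another if not
  BIND(IF(EXISTS { ?item wdt:P2949 [] }, "88ffff", "ff8888") AS ?rgb)
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en" }
}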

Returning an image without url encoding

If I ask for an image (P18) in the SPARQL GUI, I get 'commons:HMS Effingham.jpg', whose spaces are fine with me. But when I download a CSV file, and particularly when I do an API query with

curl -s -H "Accept: application/json;User-Agent: expounder" -G 'https://query.wikidata.org/sparql' --data-urlencode query="BLAH"

I get the result URL encoded as 'File:HMS%20Effingham.jpg'. I actually wanted the version with the spaces, so I can use {{filepath:HMS Effingham.jpg}} to return https://upload.wikimedia.org/wikipedia/commons/2/2f/HMS_Effingham.jpg. Running filepath with HMS%20Effingham.jpg returns a blank.

I could convert %20 back to spaces, but I'm not sure of all the other translations that have been done, so I would rather have the raw value.

There is a urlencode function in Mediawiki, but the urldecode function that was in the ParserFunctions extension was not carried over when the functions were brought in house.

BIND(STR(?image) as ?image1) did not help; the bound value still has %20 in it. Vicarage (talk) 23:24, 7 January 2024 (UTC)

BIND (wikibase:decodeUri(STR(?image)) as ?image1) does, though. Vicarage (talk) 06:48, 8 January 2024 (UTC)
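For reference, a minimal sketch showing wikibase:decodeUri in context; Douglas Adams (Q42) is used here only as an arbitrary example item with an image (P18):
SELECT ?item ?image ?imageDecoded WHERE {
  VALUES ?item { wd:Q42 }                                    # arbitrary example item
  ?item wdt:P18 ?image .                                     # image (P18), returned as a percent-encoded Commons URL
  BIND(wikibase:decodeUri(STR(?image)) AS ?imageDecoded)     # %20 etc. decoded back to raw characters
}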

Tracks

Could someone make a query of all audio tracks that have been included in an album with a Bandcamp release ID (P11354) identifier? Also, make it so the query shows the ISRC (P1243), MusicBrainz recording ID (P4404), Genius song ID (P6218) and LyricsTranslate ID (P7212) of the tracks. --Trade (talk) 03:09, 8 January 2024 (UTC)

@Trade: This, probably, if we can rely on the P1433 pointer from track item to album item.
SELECT ?album ?albumLabel ?bc ?track ?trackLabel ?ISRC ?MusicBrainz ?Genius ?LyricsTranslate 
WHERE 
{
  ?album wdt:P11354 ?bc.
  ?track wdt:P1433 ?album.
  
  OPTIONAL {?track wdt:P1243 ?ISRC .}
  OPTIONAL {?track wdt:P4404 ?MusicBrainz .}
  OPTIONAL {?track wdt:P6218 ?Genius .}
  OPTIONAL {?track wdt:P7212 ?LyricsTranslate .}

  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }                    

} order by ?albumLabel ?trackLabel
Try it!
--Tagishsimon (talk) 11:55, 8 January 2024 (UTC)
Tracks like Hypatia (Q124183922) are still missing. Could you change the query so it relies on tracklist (P658) instead? Trade (talk) 22:35, 8 January 2024 (UTC)
@Trade:
SELECT ?album ?albumLabel ?bc ?track ?trackLabel ?ISRC ?MusicBrainz ?Genius ?LyricsTranslate
WHERE 
{
  ?album wdt:P11354 ?bc.
  ?album wdt:P658 ?track .
  
  OPTIONAL {?track wdt:P1243 ?ISRC .}
  OPTIONAL {?track wdt:P4404 ?MusicBrainz .}
  OPTIONAL {?track wdt:P6218 ?Genius .}
  OPTIONAL {?track wdt:P7212 ?LyricsTranslate .}

  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }                    

} order by ?albumLabel ?trackLabel
Try it!
--Tagishsimon (talk) 12:12, 9 January 2024 (UTC)
Is it possible to change the visual appearance of the query results to be more like the first? Trade (talk) 12:50, 9 January 2024 (UTC)
@Trade: The SELECTs in both queries are exactly the same; no change. Not clear what the issue you're seeing is, but my guess is that WDQS is not displaying data in columns, but as a series of rows? Don't know what the rules are for it deciding to display one versus the other. Does shortening variable names help at all? If not, then removing columns from the SELECT will eventually sort it.
SELECT ?album ?albumLabel ?bc ?track ?trackLabel ?ISRC ?MB ?G ?LT
WHERE 
{
  ?album wdt:P11354 ?bc.
  ?album wdt:P658 ?track .
  
  OPTIONAL {?track wdt:P1243 ?ISRC .}
  OPTIONAL {?track wdt:P4404 ?MB .}
  OPTIONAL {?track wdt:P6218 ?G .}
  OPTIONAL {?track wdt:P7212 ?LT .}

  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }                    

} order by ?albumLabel ?trackLabel
Try it!
--Tagishsimon (talk) 13:20, 9 January 2024 (UTC)

Babelnet

I would need an optimized query for Wikidata:WikiProject Badminton/Babel. That means: find missing BabelNet IDs for all existing articles in all languages (currently the query is limited to de), excluding categories, templates, Wikinews and Commons-only links. Thanks in advance. As an extra, not necessary if difficult: if possible (because BabelNet is slow with additions), do not list articles where the oldest created wikilink is newer than 6 months. Florentyna (talk) 20:48, 8 January 2024 (UTC)

@Florentyna: Bit unclear whether you want the query for WDQS or Listeria. Probably this, for the former (drop the call to the wikibase:label service for the latter). To be very clear, this restricts sitelinks to those which point to language Wikipedias, and excludes Commons, Wikivoyage &c. The query cannot provide "oldest created wikilink is newer than 6 months":
SELECT ?item ?itemLabel WHERE {
  ?item wdt:P641 wd:Q7291 . hint:Prior hint:runFirst true .
  ?wen schema:about ?item .  
  ?wen schema:isPartOf ?partOf .
  ?partOf wikibase:wikiGroup "wikipedia" .
  FILTER NOT EXISTS { ?item wdt:P31/wdt:P279* wd:Q4167836 . hint:Prior hint:gearing "forward". }
  FILTER NOT EXISTS { ?item wdt:P31 wd:Q11266439 .}
  FILTER NOT EXISTS { ?item wdt:P2581 [] .}
  FILTER NOT EXISTS { ?item wdt:P31 wd:Q15184295 .}
  FILTER NOT EXISTS { ?item wdt:P31 wd:Q11753321 .}
  FILTER NOT EXISTS { ?item wdt:P31 wd:Q17633526 .}
  FILTER NOT EXISTS { ?item wdt:P31 wd:Q19887878 .}
  SERVICE wikibase:label { bd:serviceParam wikibase:language 'en' .}
}
Try it!
--Tagishsimon (talk) 11:21, 10 January 2024 (UTC)

Great! Thank you really very much! --Florentyna (talk) 14:10, 10 January 2024 (UTC)

Historical events by a city sorted by importance

Hi, I am looking for a query that can return historical events for a given city, sorted by importance. Can you help? Madsbrydegaard (talk) 20:02, 9 January 2024 (UTC)

How would you personally measure importance? Item size, number of sitelinks or external identifiers... Sjoerd de Bruin (talk) 20:34, 9 January 2024 (UTC)
Good question - I am a little new to Wikidata, but it seems that sitelinks are a good start. Thank you. Madsbrydegaard (talk) 18:12, 11 January 2024 (UTC)
@Madsbrydegaard: Probably in this sort of direction; find a set of items associated with the city, perhaps looking at location (P276) and/or located in the administrative territorial entity (P131), and then check to see which of those are an instance or subclass of an occurrence (Q1190554). Then grab the sitelink count and order the results.
SELECT DISTINCT ?item ?itemLabel ?sitelinks
WHERE 
{
  { {?item wdt:P276/wdt:P276* wd:Q84 .}
  UNION
  {?item wdt:P131/wdt:P131* wd:Q23306 . } }  hint:Prior hint:runFirst true.

  ?item wdt:P31/wdt:P279* wd:Q1190554. hint:Prior hint:gearing "forward".
  
  ?item wikibase:sitelinks ?sitelinks .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". } 
} order by desc(?sitelinks)
Try it!
--Tagishsimon (talk) 23:38, 11 January 2024 (UTC)

Instances of items with only one wikipedia attached

Another query related to getting stuff interwiki'ed. This seems like such an obvious thing to look for that I tried searching various cleanup projects for a query like this, but couldn't find one. Not sure whether it would be better ordered just by language, or, more ambitiously, by the number of results found per language with the smallest counts on top - mostly because the simple version would basically break whenever one attempted to search for items with two or more Wikipedias, which would be the logical next step after the one-Wikipedia items got exhausted. Akaibu1 (talk) 21:58, 12 January 2024 (UTC)

@Akaibu1: I'm not convinced. There must be hundreds of thousands of items with only one language wikipedia sitelink. I could give them to you by the 100,000 with itemlabels, or by the 300,000 without. And you'll do what with these huge & unmanageable lists?
SELECT ?item ?itemLabel ?wen WHERE {
  SERVICE bd:slice {
  ?item wikibase:sitelinks "1"^^xsd:integer  .
    bd:serviceParam bd:slice.offset 0 . # Start at item number (not to be confused with QID)
    bd:serviceParam bd:slice.limit 100000 . # List this many items
  } hint:Prior hint:runFirst true.
  
  ?wen schema:about ?item .  
  ?wen schema:isPartOf ?partOf .
  ?partOf wikibase:wikiGroup "wikipedia" .
  
  FILTER NOT EXISTS { ?item wdt:P31 wd:Q11266439 .}
  FILTER NOT EXISTS { ?item wdt:P31 wd:Q15184295 .}
  FILTER NOT EXISTS { ?item wdt:P31 wd:Q11753321 .}
  FILTER NOT EXISTS { ?item wdt:P31 wd:Q17633526 .}
  FILTER NOT EXISTS { ?item wdt:P31 wd:Q19887878 .}
  FILTER NOT EXISTS { ?item wdt:P31 wd:Q4167836 .}
  SERVICE wikibase:label { bd:serviceParam wikibase:language 'en' .}
}
Try it!
SELECT ?item ?itemLabel ?wen WHERE {
  SERVICE bd:slice {
  ?item wikibase:sitelinks "1"^^xsd:integer  .
    bd:serviceParam bd:slice.offset 0 . # Start at item number (not to be confused with QID)
    bd:serviceParam bd:slice.limit 300000 . # List this many items
  } hint:Prior hint:runFirst true.
  
  ?wen schema:about ?item .  
  ?wen schema:isPartOf ?partOf .
  ?partOf wikibase:wikiGroup "wikipedia" .
  
  FILTER NOT EXISTS { ?item wdt:P31 wd:Q11266439 .}
  FILTER NOT EXISTS { ?item wdt:P31 wd:Q15184295 .}
  FILTER NOT EXISTS { ?item wdt:P31 wd:Q11753321 .}
  FILTER NOT EXISTS { ?item wdt:P31 wd:Q17633526 .}
  FILTER NOT EXISTS { ?item wdt:P31 wd:Q19887878 .}
  FILTER NOT EXISTS { ?item wdt:P31 wd:Q4167836 .}
}
Try it!
--Tagishsimon (talk) 00:35, 13 January 2024 (UTC)
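If the per-language ordering mentioned in the request is wanted, a sketch (untested, built on the same bd:slice approach and subject to the same size caveats) that tallies single-sitelink items per wiki, smallest counts first:
SELECT ?partOf (COUNT(DISTINCT ?item) AS ?count) WHERE {
  SERVICE bd:slice {
  ?item wikibase:sitelinks "1"^^xsd:integer  .
    bd:serviceParam bd:slice.offset 0 . # Start at item number (not to be confused with QID)
    bd:serviceParam bd:slice.limit 100000 . # List this many items
  } hint:Prior hint:runFirst true.

  ?wen schema:about ?item .
  ?wen schema:isPartOf ?partOf .
  ?partOf wikibase:wikiGroup "wikipedia" .
}
GROUP BY ?partOf
ORDER BY ASC(?count)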

All the states that preceded a current country

I want to see all states that preceded the United Kingdom (Q145) using replaces (P1365) and/or follows (P155). I see there are hard-coded 3-level solutions like https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/queries/examples#Years_with_3_popes, but I want a recursive tree, so I get United Kingdom (Q145) < United Kingdom of Great Britain and Ireland (Q174193) < Kingdom of Great Britain (Q161885) < (Kingdom of England (Q179876), Kingdom of Scotland (Q230791)) (with the extra problem that Commonwealth of England (Q330362) sits between 2 phases of Kingdom of England (Q179876), so it could lead to an infinite loop). I just need a list, not a tree. Vicarage (talk) 15:26, 16 January 2024 (UTC)

How about this?
SELECT DISTINCT ?item ?itemLabel WHERE {
  wd:Q145 (wdt:P155|wdt:P1365)+ ?item.
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
}
Try it!
Binarycat32 (talk) 20:38, 16 January 2024 (UTC)
That works a treat. Gosh, what a lot of them! Vicarage (talk) 20:52, 16 January 2024 (UTC)
There's also this sort of thing.
SELECT ?successor ?successorLabel ?predecessor ?predecessorLabel ?depth WHERE
{
  SERVICE gas:service {
    gas:program gas:gasClass "com.bigdata.rdf.graph.analytics.BFS" ;
                gas:in wd:Q145 ;
                gas:linkType wdt:P1365 ;
                gas:out ?predecessor ;
                gas:out1 ?depth ;
                gas:out2 ?successor .
  }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
} order by ?depth
Try it!
--Tagishsimon (talk) 21:33, 16 January 2024 (UTC)

Content assessment grades for Wikipedia pages

Is there a way to query the content assessment grade of Wikipedia articles returned in a SPARQL query? For example, starting with:

SELECT ?item ?itemLabel ?article WHERE {
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
  ?item wdt:P131 wd:Q18424.
  ?article schema:about ?item .
  ?article schema:inLanguage "en" .
  FILTER (SUBSTR(str(?article), 1, 25) = "https://en.wikipedia.org/")
}
ORDER BY (?itemLabel)
Try it!

Could the query also return a column with some indication of the article's grade/class? I found some discussion of content assessment in this previous query request but am having trouble adapting what was discussed there.

Thank you! --Infopetal (talk) 17:37, 19 January 2024 (UTC)

Yes, this is one of the newer things that was added to Wikidata; see Help:Badges. It looks like the English Wikipedia still uses manual templates to mark the articles, though, and I'm not sure how well synced the badges are. Apparently DeltaBot was syncing them at some point in the past. So unless people have been manually adding badges on Wikidata, the badges aren't guaranteed to be synced. Infrastruktur (talk) 21:52, 19 January 2024 (UTC)
SELECT ?item ?itemLabel ?article ?badgeLabel 
WHERE {
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
  ?item wdt:P131 wd:Q18424.
  ?article schema:about ?item .
  ?article schema:isPartOf <https://en.wikipedia.org/> .
  optional { ?article wikibase:badge ?badge . }
}
ORDER BY (?itemLabel)
Try it!
Infrastruktur (talk) 21:52, 19 January 2024 (UTC)

Notability: Structural need

I've been working on a new version of my tool that automatically analyses pages based on the notability criteria (see beta snapshot). I have a number of outstanding issues related to N3 (structural need) that I could use some assistance with. They're separable topics, so I'll use subsections. Bovlb (talk) 20:32, 21 January 2024 (UTC)

Incoming links

I want to detect whether an item is linked to by another item. I originally used:

SELECT * WHERE {
  VALUES ?x { wd:Q42 }
  # N3: It fulfills a structural need, for example: it is needed to make statements made in other items more useful.
  BIND(EXISTS { # ?notability3_strong
    ?statement ?ps ?x .
    ?other ?p ?statement .
    ?prop wikibase:claim ?p .
    FILTER(?other != ?x)
  } AS ?notability3_strong)
}
Try it!

Unfortunately, while this works, the performance is terrible for items that have many inlinks. I have tried various alternatives. The best I have found so far is:

SELECT * WHERE {
  VALUES ?x { wd:Q42 }
  # N3: It fulfills a structural need, for example: it is needed to make statements made in other items more useful.
  BIND(EXISTS { # ?notability3_strong
    #hint:Query hint:optimizer "None" .
    ?property wikibase:directClaim ?p .
    #hint:Prior hint:runFirst true .
    ?other ?p ?x .
    FILTER(?other != ?x)
  } AS ?notability3_strong)
}
Try it!

This is better, but still has performance problems.

I did look at using the `linkshere` API, but I couldn't get the SPARQL federation to produce any results. Bovlb (talk) 20:32, 21 January 2024 (UTC)

Entity usage

Sitelinks are not the only use that client projects make of Wikidata items. If you look at "Page information", you can see a list of wikis subscribed to this entity. This includes sitelinks and other uses.

Is there any way to query this list in SPARQL? Bovlb (talk) 20:32, 21 January 2024 (UTC)

Query usage in SDC

Also on "Page information" is a link "query usage in SDC", which accesses the Wikimedia Commons Query Service[1].

Is there any way to access this information via the WDQS and federation? Bovlb (talk) 20:32, 21 January 2024 (UTC)

Open Street Map

I have a gadget to check for OSM use. Is there any way to incorporate that in SPARQL federation? Bovlb (talk) 20:32, 21 January 2024 (UTC)

General discussion

Incoming links and entity usage are much easier to fetch from the classical MediaWiki API. Do you have reasons to limit yourself to SPARQL only? SDC queries, as far as I am aware, only work on the SDC endpoint with authentication. It would be odd if one could bypass the authentication with federation at the un-authenticated WDQS endpoint. —MisterSynergy (talk) 20:46, 21 January 2024 (UTC)

Hmm. Having all the questions about one item answered in a single API call makes the design of my code simpler and also makes it possible to show results incrementally. Another approach I could take is to build a specialized tool that accesses multiple services and answers all the relevant questions in a single call. That would also be a good place to introduce caching. This would make it harder for someone to clone and improve my tool. Bovlb (talk) 21:57, 21 January 2024 (UTC)
Not sure how exactly you plan to use the gadget; however, it seems much cleaner to me to query one condition per request, at the best possible API that can answer your problem, and put all responses together via Javascript. The tool would make a series of simple requests rather than one large complicated one, but it should in general be much more efficient and scalable than your current approach. —MisterSynergy (talk) 22:13, 21 January 2024 (UTC)
OK. You might be right. I'm going to have to think about it. I usually try to avoid getting too complicated in the browser, although I realise that I may be alone there. :) A custom tool, on the other hand, would allow for a better caching policy. Either way, breaking things up from a monolithic SPARQL query would likely be more efficient, and would open the door for more detailed reporting.
The usage will be to annotate every item mentioned on a page (e.g. Recent Changes, User Contributions, Abuse Filter) with its notability status. This lets you see at a glance whether there's a pattern of non-notable or empty items. The beta I posted above roughly works for that. Bovlb (talk) 23:10, 21 January 2024 (UTC)

Regarding checking for incoming links, this can get complicated to do in SPARQL if you want to cover all the bases. An item can be linked to from a reference, from a qualifier and from a statement claim. It should be sufficient to check statement claims without checking direct claims and value nodes, since non-truthy statements don't have direct claims, and if there is a statement claim you can infer there is a value-node claim. This is much simpler to do with a direct API call. I think the API uses the Elasticsearch engine under the hood, which actually maintains a reverse mapping, so this makes it fast.

Entity usage: API call here too.

Open Street Map: If you're checking for links from OSM to Wikidata here, that is sort of possible at the moment. The official SPARQL endpoint hasn't been updated since the middle ages, but Qlever maintains a reasonably up-to-date version of the OSM metadata+data on their "OSM Planet" public endpoint. Unfortunately I haven't been able to coax Lydia into whitelisting it yet, so you will have to grab the data from the endpoint directly without being able to do federation. I can probably make an example query for this if you want, but I don't see any advantages over simply using the Overpass API, unless you intend to do things in bulk. Infrastruktur (talk) 22:50, 21 January 2024 (UTC)

List of Q values?!

Sorry if this has maybe been asked already ... I would like to be able to "browse" Q values. Maybe I can run a query that will show me some definable range, like Q200 through Q300. So, the results would be the item name, then pull in the first few short descriptions for each result, then maybe the first one or two statements of each result, like "subclass of" or "instance of". Is this feasible? I would like to be able to run it for any range, within reason; like tomorrow I might want to look at Q4200 through Q4300. Make sense? Thanks in advance, Skakkle (talk) 18:12, 22 January 2024 (UTC)

Any particular reason why you chose a nick that translates to "LOL" in spoken Norwegian while you say you are from Pennsylvania? Granted, you also say you are interested in language, but I also happen to be a native. Infrastruktur (talk) 20:20, 22 January 2024 (UTC)
It's a coincidence. Interesting to hear that, though! It was just a funny sequence of sounds to my English ears, but I chose it in 2009, so I barely remember. Skakkle (talk) 20:28, 22 January 2024 (UTC)
I guess it does sound funny and rhymes with "cackle". combines lexemes (P5238): skakk (L656132) + le (L303773) Infrastruktur (talk) 16:38, 23 January 2024 (UTC)
You might be interested in the enumitems gadget in your preferences. author  TomT0m / talk page 11:24, 23 January 2024 (UTC)
thanks, that's cool, but I'm still hoping to be able to run a list that's like 50-100 results per page, or similar. Skakkle (talk) 20:38, 23 January 2024 (UTC)
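No query was offered in this thread; for what it's worth, a sketch of one way to do it. WDQS has no built-in way to iterate Q-numbers, so the range has to be enumerated (for example in a VALUES list generated outside the query); the handful of QIDs below are placeholders only:
SELECT ?item ?itemLabel ?itemDescription ?class ?classLabel ?superclass ?superclassLabel WHERE {
  # placeholder range: in practice, generate the full Q200..Q300 list externally
  VALUES ?item { wd:Q200 wd:Q201 wd:Q202 wd:Q203 wd:Q204 wd:Q205 }
  OPTIONAL { ?item wdt:P31 ?class . }         # instance of (P31), if present
  OPTIONAL { ?item wdt:P279 ?superclass . }   # subclass of (P279), if present
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}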

Get P-elements by creation user

Hello! I'm trying to figure out how to get all the properties I created in Wikidata.

I'd like to get all the properties, and filter them by creation user equal to Luca.favorido (that's me). :-) But it seems not so straightforward...

Any suggestion or help? Thanks! Luca.favorido (talk) 21:03, 22 January 2024 (UTC)

For some reason the user contributions page times out when I try to filter your page creations in the property namespace. I guess the production servers must be under some serious load. It only takes 2.5 seconds to get the same data from the replicas. You can use Quarry, which is probably the simplest, but there is also a new service called Superset.
use wikidatawiki_p;
select rev_timestamp, page_title
from revision_userindex
join page on rev_page = page_id
where rev_actor = (select actor_id from actor where actor_name = 'Luca.favorido')
  and rev_parent_id = 0
  and page_namespace = 120
order by rev_timestamp desc
limit 1000;
HTH. Infrastruktur (talk) 12:51, 23 January 2024 (UTC)
Thanks @Infrastruktur!
Wow, interesting approach, I never used Quarry before. Later, I'll try this query. :-) Luca.favorido (talk) 13:14, 23 January 2024 (UTC)

All short descriptions in Thai

How can I retrieve all Thai short descriptions (item descriptions) for every item, including their linked Wikipage name for thwiki and their first P31 label? If feasible, is it possible to filter out template/project page items? This is for my personal study to generate short descriptions for Thai Wikipedia, and it will contribute to creating a substantial dataset for my project. I appreciate any suggestions. Thank you in advance! —Patsagorn Y? 08:30, 25 January 2024 (UTC)

@Patsagorn Y.: Probably something like this: https://qlever.cs.uni-freiburg.de/wikidata/BzKiSG ... there's no concept of a first P31, so you get all P31s, and I made the presumption you wanted Thai-language labels. The download option (e.g. as CSV) gives you ~197k rows. --Tagishsimon (talk) 21:07, 25 January 2024 (UTC)
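The linked QLever query is not reproduced in this archive; a sketch of the general shape it presumably takes (the result set is large, so it is better suited to QLever or a dump than to WDQS, and the class label is fetched via rdfs:label so it works outside Blazegraph):
SELECT ?item ?description ?sitelink ?class ?classLabel WHERE {
  ?item schema:description ?description .
  FILTER(LANG(?description) = "th")                      # Thai item descriptions only
  ?article schema:about ?item ;
           schema:isPartOf <https://th.wikipedia.org/> ;
           schema:name ?sitelink .                        # linked thwiki page name
  OPTIONAL {
    ?item wdt:P31 ?class .                                # all P31 values (there is no "first" one)
    ?class rdfs:label ?classLabel . FILTER(LANG(?classLabel) = "th")
  }
}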

Two queries: properties with numeric values, and start/end date statistics for them

I need 2 queries. 1. One that gets all distinct relation names and their IDs whose second entity is a numeric object, for example relations like population, GDP, etc. 2. Among all those numeric relations, I need statistics on all those where a start and end date are associated with them. Subhendussahoo (talk) 14:50, 26 January 2024 (UTC)
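This request went unanswered in the archive; as a starting point for the first part, a sketch listing properties whose datatype is a quantity, using the same wikibase:propertyType vocabulary shown in the RDF-dump excerpt further down this page:
SELECT ?property ?propertyLabel WHERE {
  ?property a wikibase:Property ;
            wikibase:propertyType wikibase:Quantity .   # properties whose values are numeric quantities
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}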

Converting a generated property from wd:X to wdt:X

For organisations doing a certain thing, I can get their external ID properties, but I can't see how to convert the wd:P4750 I get into a wdt:P4750 value to use in the middle of a triple to find things with that property. So I'm looking for a way of uncommenting the line in the middle, so it's not hard coded.

SELECT DISTINCT ?item ?itemLabel ?register ?registerLabel ?issuerLabel  ?valueLabel WHERE {
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en-GB,en,fr,de,es,pt,pl,nl,cs". }
  { SELECT DISTINCT ?register ?issuer ?value WHERE {
    
    ?register wdt:P31 ?instance.
    VALUES ?instance {wd:Q23779665}
    
    ?register wdt:P2378 ?issuer. ?issuer wdt:P17 wd:Q145.
    
    #?item ?register ?value
    ?item wdt:P4750 ?value
    }
  }
}
Try it!

Vicarage (talk) 13:58, 26 January 2024 (UTC)

Taken from the data model reference mw:Wikibase/Indexation/Format RDF Dump:
wd:P22 a wikibase:Property ;
    rdfs:label "Item property"@en ;
    wikibase:propertyType wikibase:WikibaseItem ;
    wikibase:directClaim wdt:P22 ;
    wikibase:claim p:P22 ;
    wikibase:statementProperty ps:P22 ;
    wikibase:statementValue psv:P22 ;
    wikibase:qualifier pq:P22 ;
    wikibase:qualifierValue pqv:P22 ;
    wikibase:reference pr:P22 ;
    wikibase:referenceValue prv:P22 ;
    wikibase:novalue wdno:P22 .
You can use ?prop wikibase:directClaim ?propwdt . author  TomT0m / talk page 14:15, 26 January 2024 (UTC)
So as a worked example:
SELECT DISTINCT ?item ?itemLabel ?register ?registerLabel ?issuerLabel  ?valueLabel WHERE {
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en-GB,en,fr,de,es,pt,pl,nl,cs". }
  { SELECT DISTINCT ?item ?register ?issuer ?value WHERE {
    
    ?register wdt:P31 ?instance .
    VALUES ?instance {wd:Q23779665}
    
    ?register wdt:P2378 ?issuer . ?issuer wdt:P17 wd:Q145 .
    ?register wikibase:directClaim ?wdt_predicate .
    
    ?item ?wdt_predicate ?value .
    }
  }
}
Try it!
--Tagishsimon (talk) 15:42, 26 January 2024 (UTC)
Thanks to you both. I spotted directClaim in the examples, but its use wasn't clear. Vicarage (talk) 09:08, 27 January 2024 (UTC)

Retrieving anything which is a body of marine water

Hello. I want to retrieve the name and coordinates of all entries that are a body of marine water. I would include oceans, seas, gulfs, bays, straits and passages. There may be others, such as reefs for example. I have tried to do a query using subclass of body of water (Q15324), but this timed out.

When I used marine water body (Q116126039) it only returned about 270 results, which is far too few.

As well as the name and latitude and longitude, I would also like to know the "parent" entry - part of (P361). So the North Atlantic is part of the Atlantic Ocean. I hope someone finds this interesting and can help. Kind regards. Chessel85 (talk) 21:52, 26 January 2024 (UTC)

The WD data model is poor when it comes to assembling the sort of report you're after. By way of example, there are ~70k bay items in WD, but only ~1k of them disclose which waterbody they're a part of - https://w.wiki/8yMy . I think much the same pattern will hold for other marine objects. So it is possible to put together a report starting with seas and oceans - P31/P279 Q116126039 and then looking for things that are P361 or P206 the sea, but you'll get ~2% of marine items. If you look for items that are not connected to a sea by P361 or P206, then you'll get freshwater items as well as marine water items. :( --Tagishsimon (talk) 22:23, 26 January 2024 (UTC)
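A sketch of the approach described above - start from seas and oceans, then collect items linked to them by part of (P361) or located in or next to body of water (P206) - with the caveat, per the reply, that this reaches only a small fraction of marine items:
SELECT DISTINCT ?item ?itemLabel ?coord ?sea ?seaLabel WHERE {
  ?sea wdt:P31/wdt:P279* wd:Q116126039 .       # seas, oceans and other marine water bodies
  ?item (wdt:P361|wdt:P206) ?sea .             # part of, or located in/next to, that body of water
  OPTIONAL { ?item wdt:P625 ?coord . }         # coordinate location, where present
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}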

Help populating a new Category in Simple EN

Is it possible to search W:People from the Gaza Strip and all its subcategories, to find out which articles exist in the Simple Wikipedia? The purpose is to (manually) populate the new Simple:Category:People from the Gaza Strip. Thank you - I'm not advanced enough to write my own SPARQL, but I appreciate the results! -- Deborahjay (talk) 11:32, 31 January 2024 (UTC)