User:Hjfocs/pst proposal


Primary sources tool uplift proposal

The first and current version of the primary sources tool (PST) stems from the donation of Freebase by Google.[1] Based on community feedback collected since its deployment as a Wikidata gadget,[2][3][4] the StrepHit team here submits a radical uplift proposal that will lead to the next version of the tool.

Beta version of Primary sources tool

Please note that all the mock-ups referenced in this document are accessible at phab:M218.

Current code base

  • Back end: written in C++, with SQL storage, deployed as a Web service via REST APIs;[5]
  • front end: written in JavaScript, deployed as a Wikidata gadget.[6]

Goals

The general goal is to make the tool self-sustainable. To achieve this, the highest priority is given to:

  • Web standards;
  • stability, i.e., choices driven by the Wikidata stable interface policy;[7]
  • adoption of programming languages that are widespread in the Wikimedia community.

In addition, the tool should also become the preferred choice for data releases from third-party providers.[8] This makes a standard release procedure all the more important.

User workflow

The user can approve or reject a new statement suggested by the tool:

  1. given an Item page, the suggested statement is highlighted with a blue background;
  2. the user can approve or reject it by clicking the approve claim or the reject claim link respectively;
  3. after that, the page will update with the new statement in the first case, or without it in the second.
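When a suggestion is approved, the tool ultimately has to write the statement to Wikidata through the MediaWiki API. As a minimal sketch, the parameter set for a `wbcreateclaim` call could be built as follows; the helper name is invented for the example, and a real client would also fetch and attach a CSRF token.

```python
import json

def build_create_claim_params(entity_id, property_id, value_qid):
    """Build the parameter set for a MediaWiki `wbcreateclaim` call.

    The snak value is serialized as a Wikibase entity-id value; a real
    client would add a CSRF token obtained from the API.
    """
    return {
        "action": "wbcreateclaim",
        "entity": entity_id,
        "property": property_id,
        "snaktype": "value",
        "value": json.dumps({
            "entity-type": "item",
            "numeric-id": int(value_qid.lstrip("Q")),
        }),
        "format": "json",
    }

# E.g., approving the suggestion "Douglas Adams (Q42) instance of (P31) human (Q5)"
params = build_create_claim_params("Q42", "P31", "Q5")
```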

In the same fashion, the tool can suggest new references for an existing statement:[9]

  1. the new reference is highlighted with a blue background;
  2. the user can approve or reject it by clicking the approve reference or the reject reference link respectively;
  3. the user can also see a preview tooltip that shows where the source came from by clicking on preview reference;[10][11]
  4. if the dataset contains fine-grained provenance information, e.g., the text snippet from which the suggested statement was extracted,[12] the preview tooltip will highlight that exact piece of information;[13]
  5. if the interaction between the front end and the back end fails, a tooltip with an alert message will show up.[14]
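Approving a reference similarly maps to a MediaWiki API call, namely `wbsetreference`, which attaches a set of reference snaks to an existing statement. A minimal sketch, assuming the source is given as a reference URL (P854) snak:

```python
import json

def build_set_reference_params(statement_guid, reference_url):
    """Build the parameter set for a MediaWiki `wbsetreference` call that
    attaches a reference URL (P854) snak to an existing statement.
    A real client would also add a CSRF token."""
    snaks = {
        "P854": [{
            "snaktype": "value",
            "property": "P854",
            "datavalue": {"type": "string", "value": reference_url},
        }]
    }
    return {
        "action": "wbsetreference",
        "statement": statement_guid,
        "snaks": json.dumps(snaks),
        "format": "json",
    }
```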

Primary Sources configuration

  1. When the user clicks on the gear icon next to the Random Primary Sources item link (cf. the section below) in the main menu on the left sidebar, a modal window will open;[15][16][17]
  2. the user can search and select which dataset to use;
  3. essential information is shown, namely Dataset description, Missing statements and Total statements;
  4. the user can either Save or Cancel the new settings.

Random Primary Sources item

  1. The user can jump to a random Item containing suggested statements by clicking on the Random Primary Sources item link located in the main menu on the left sidebar;
  2. the item will be randomly picked from the datasets selected in the Primary sources configuration.
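With a SPARQL back end (see #Architecture below), picking such a random Item could be expressed as a single query. The following sketch assumes each dataset is stored as a named graph; the graph URI scheme and the use of ORDER BY RAND() (supported by Blazegraph) are assumptions, not the final design.

```python
def random_item_query(dataset_uri):
    """A possible SPARQL query picking one random Item that has
    suggested statements in the given dataset (named graph).
    The named-graph URI scheme is an assumption for illustration."""
    return f"""
SELECT DISTINCT ?item WHERE {{
  GRAPH <{dataset_uri}> {{ ?item ?property ?value . }}
}}
ORDER BY RAND()
LIMIT 1
""".strip()
```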

Browse Primary Sources

  1. The user can browse through the suggested statements grouped by property by clicking on the appropriate property link below the Browse Primary Sources menu on the left sidebar;
  2. the user can move back to the top of the page by clicking on the back to top link right below the Browse Primary Sources menu on the left sidebar.

Filter-based tool

A similar workflow applies to a filter-based tool, located in the Tools menu of the left sidebar.

  1. When the user clicks on the Primary Sources filter link (currently Primary Sources list), a modal window will open;[18]
  2. the user can view a table of suggested statements with any available references by building filters in several ways:
    • Domain of interest: the user starts typing a domain he or she is interested in and gets autocompletion based on simple constraints, typically the instance of (P31) property. For example, list all the Items that are a chemical compound (Q11173);
    • Property: the user starts typing a property he or she is interested in and gets autocompletion based on property labels. This filter then only shows suggested statements with the given property. For instance, list all statements with date of birth (P569);
    • SPARQL Query: this filter is intended for power users and accepts arbitrary SPARQL queries;
    • Source language: shows only statements in the selected language;
    • Dataset: lets the user pick one or more specific datasets to use, similarly to #Primary_Sources_configuration.

After the filters are built, the tool shows a table of statements where the user can approve or reject suggestions after previewing the reference source, as per #User_workflow. The approval and rejection actions can be blocked until the source preview has been opened.[19]
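The Domain of interest filter above maps naturally onto a SPARQL pattern. As a sketch, the query it could issue for the chemical compound (Q11173) example looks like this; the prefixes follow the Wikidata RDF conventions, while the exact query shape is an assumption:

```python
def domain_filter_query(class_qid, limit=50):
    """Sketch of the SPARQL query the Domain of interest filter could
    issue: all suggested statements about Items that are an
    instance of (P31) the given class, e.g. chemical compound (Q11173)."""
    return f"""
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
PREFIX wd: <http://www.wikidata.org/entity/>
SELECT ?item ?property ?value WHERE {{
  ?item wdt:P31 wd:{class_qid} ;
        ?property ?value .
}}
LIMIT {limit}
""".strip()
```

The Property and SPARQL Query filters would follow the same path: the former generates a fixed pattern from the selected property, while the latter passes the user's query through unchanged.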

Architecture

General architecture proposal for the d:Wikidata:Primary_sources_tool version 2, based on meta:Grants:IEG/StrepHit:_Wikidata_Statements_Validation_via_References/Renewal

Back-end implementation

Data format

The tool currently accepts datasets serialized in QuickStatements (Q20084080). While it is indeed a very compact format, useful for uploading large datasets, it is totally non-standard: the only available documentation is contained in the QuickStatements service page itself.[20] Hence, we plan to support stable formats, both for the self-sustainability of the project and for a standardized data donation workflow. QuickStatements support will nevertheless be retained.

Datasets from third-party providers should be serialized in RDF and follow the Wikidata RDF data model.[21] We believe this is the most standard way for two reasons:

  1. RDF is a mature Web standard, being a W3C recommendation since 1999;[22]
  2. The Wikidata RDF export format is claimed to be stable.[23]
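To make the expected serialization concrete, here is a purely illustrative Turtle fragment following the Wikidata RDF data model: a full statement node carrying a reference, using the standard Wikibase prefixes. The QIDs and statement/reference identifiers are made up for the example.

```python
# Illustrative dataset entry in the Wikidata RDF data model: a full
# statement ("Q42 instance of human") with an attached reference node.
# Identifiers and the source URL are invented for the example.
TURTLE_EXAMPLE = """
@prefix wd:    <http://www.wikidata.org/entity/> .
@prefix wds:   <http://www.wikidata.org/entity/statement/> .
@prefix wdref: <http://www.wikidata.org/reference/> .
@prefix p:     <http://www.wikidata.org/prop/> .
@prefix ps:    <http://www.wikidata.org/prop/statement/> .
@prefix pr:    <http://www.wikidata.org/prop/reference/> .
@prefix prov:  <http://www.w3.org/ns/prov#> .

wd:Q42 p:P31 wds:Q42-abc123 .
wds:Q42-abc123 ps:P31 wd:Q5 ;
               prov:wasDerivedFrom wdref:deadbeef .
wdref:deadbeef pr:P854 <http://example.org/source-page> .
"""
```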

Main component

Given these premises, a Wikidata Query Service[24] instance is a good fit for the back end, since it:

  • uses an RDF triple store, namely Blazegraph, as the storage engine;[25]
  • is claimed to be a stable Wikidata public API;[26]
  • is written in Java, arguably a more widely adopted programming language than the C++ of the current implementation;
  • has facilities to upload datasets in Wikidata RDF dump format;[27]
  • exposes APIs to access data via SPARQL, specifically useful for both the domain filter and the query text box features.[28]

The main tool will support full statements, while the filter-based tool should be fed with truthy statements.
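The distinction matters for the query shapes the two tools issue. Truthy statements are direct `wdt:` triples, compact but without qualifiers or references; full statements go through `p:`/`ps:` statement nodes, which keep references reachable. A sketch of the two patterns (for date of birth (P569)):

```python
# The filter-based tool reads truthy (direct, wdt:) triples; the main
# tool walks full statement nodes (p:/ps:) so that references remain
# reachable via prov:wasDerivedFrom.
TRUTHY_PATTERN = "?item wdt:P569 ?dateOfBirth ."

FULL_PATTERN = """
?item p:P569 ?statement .
?statement ps:P569 ?dateOfBirth .
OPTIONAL { ?statement prov:wasDerivedFrom ?reference . }
"""
```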

Ingestion API

The Ingestion API is responsible for the interaction with third-party data providers. Incoming datasets are first validated against the Wikidata RDF data model. It will then provide the following facilities for datasets:

  • upload;
  • update;
  • drop.

Curation API

The Curation API is responsible for the interaction with Wikidata users through two main services: it will suggest claims for addition, and it will flag rejected suggestions in the back-end storage so they are not served again.
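As a minimal sketch of those two services, the front end could fetch suggestions per Item and post back rejection flags. Endpoint names and payload fields below are assumptions, not the final design.

```python
import json

def suggestions_url(base, qid, dataset=None):
    """URL fetching suggested claims for an Item, optionally restricted
    to a single dataset (hypothetical endpoint layout)."""
    url = f"{base}/suggestions/{qid}"
    return url + (f"?dataset={dataset}" if dataset else "")

def rejection_payload(statement_id, user):
    """Body flagging a suggestion as rejected in the back-end storage,
    so it is no longer served to other curators."""
    return json.dumps({
        "statement": statement_id,
        "state": "rejected",
        "user": user,
    })
```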

Front-end implementation

The main self-sustainability goal is to avoid breaking the front end whenever a change is made in the Wikidata user interface. To achieve this, the current gadget will become a MediaWiki extension for Wikibase (Q16354758). A major refactoring of the code base is essential and will:

  • include unit tests. Failures are expected in case of changes in the Wikidata user interface, and will break the Wikidata build instead of breaking the tool;
  • make a clear distinction between the interaction with the back end and the users;
  • port the HTML templates.

The code will be split into the two typical components of a MediaWiki extension, written in PHP and JavaScript respectively.

PHP component

The PHP component will be a MediaWiki action API module,[29] responsible for:

  • fetching the data from the tool back end;
  • generating an HTML template based on the PHP implementation[30] of the Wikibase data model:[31] each Snak is processed through the corresponding generateHTML function,[32] and the template is then filled with the fetched data;
  • returning the template to the JavaScript component.
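Conceptually, the per-Snak HTML generation step produces a fragment that the JavaScript component can append to the Item page. A toy stand-in (in Python rather than PHP, and with invented class names) to illustrate the idea:

```python
import html

def render_snak_html(property_id, value_label):
    """Toy stand-in for the per-Snak HTML generation step: produce a
    fragment for one suggested claim. Class names are invented for
    the example; the real templates come from the Wikibase PHP code."""
    return (
        '<div class="pst-suggested-claim">'
        f'<span class="pst-property">{html.escape(property_id)}</span> '
        f'<span class="pst-value">{html.escape(value_label)}</span>'
        '</div>'
    )
```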

JavaScript component

The JavaScript component will:

  • handle the final template rendering. More specifically, it will:
    • query the PHP component for the template and the tool back end for data relevant to the Item;
    • append the template to the existing Item statements when needed;
  • handle the interaction with the user. More specifically, it will:
    • notify the tool back end on approval or rejection of a suggested claim or reference;
    • add an approved claim or reference to Wikidata via the MediaWiki API;[33]
  • implement the features described in #User_workflow.

References

  1. Pellissier Tanon, T., Vrandečić, D., Schaffert, S., Steiner, T., and Pintscher, L. (2016, April). From Freebase to Wikidata: The Great Migration. In Proceedings of the 25th International Conference on World Wide Web (pp. 1419-1428). ACM (2016)
  2. Wikidata:Requests_for_comment/Semi-automatic_Addition_of_References_to_Wikidata_Statements
  3. Wikidata_talk:Primary_sources_tool
  4. phab:project/view/2788
  5. https://github.com/Wikidata/primarysources/tree/master/backend
  6. https://github.com/Wikidata/primarysources/tree/master/frontend
  7. Wikidata:Stable_Interface_Policy
  8. Wikidata:Data_donation#3._Work_with_the_Wikidata_community_to_import_the_data
  9. Help:Sources
  10. PST - Wireframe 1
  11. PST - Mockup 1
  12. m:Grants:IEG/StrepHit:_Wikidata_Statements_Validation_via_References
  13. PST - Mockup 2
  14. PST - Mockup 3
  15. PST configuration - Mockup 1
  16. PST configuration - Mockup 2
  17. PST configuration - Mockup 3
  18. PS filter - Mockup 1
  19. PS filter - Mockup 2
  20. https://tools.wmflabs.org/wikidata-todo/quick_statements.php
  21. mw:Wikibase/Indexing/RDF_Dump_Format#Data_model
  22. https://www.w3.org/TR/PR-rdf-syntax/
  23. Wikidata:Stable_Interface_Policy#Stable_Data_Formats
  24. mw:Wikidata_query_service
  25. phab:T166503
  26. Wikidata:Stable_Interface_Policy#Stable_Public_APIs
  27. https://github.com/wikimedia/wikidata-query-rdf/tree/master/dist/src/script
  28. phab:T166512
  29. mw:API:Extensions
  30. https://github.com/wmde/WikibaseDataModel
  31. mw:Wikibase/DataModel
  32. https://doc.wikimedia.org/Wikibase/master/php/
  33. https://www.wikidata.org/w/api.php