Wikibase Community User Group/Meetings/2023-02-23


Online meeting of the Wikibase Community User Group.


Participants (Who is here?)

  1. Laurence 'GreenReaper' Parry (WBUG/
  2. David Lindemann (UPV/EHU,
  3. Andra
  4. Mairelys
  5. Wai-yin
  6. Alexander Pacha
  7. Jeff Goeke-Smith
  8. Andreas
  9. Evelien (WMDE)
  10. Jon Amar (WMDE)
  11. Giovanni Bergamin
  12. Peter
  13. Jose Emilio Labra Gayo
  14. Sandra Fauconnier
  15. Eduards Skvireckis
  16. James Hare


  • 17:00h - 17:05h Welcome everyone!
  • 17:10h - 17:40h Presentations, 'Data Modelling of individual users or collectives using Wikibase'
  • 17:40h - 18:00h Exchange and time for questions


David Lindemann
"Data Modelling of individual users or collectives using Wikibase"
  • Use of NOVALUE statements, not as on Wikidata (where NOVALUE is "the value attributed to a claim when we are sure that the property has no value for an element", e.g. as a value for Wikidata P40 when somebody has NO children), but as a placeholder for the value to be obtained through reconciliation (OpenRefine): one example with NOVALUE for person items versus one with reconciled values. The example comes from a workflow where we upload literal values (person names) and then reconcile these against person items already existing in our wikibase using OpenRefine. This was also part of what we presented here:
  • Existing statements are updated in place from NOVALUE to SOMEVALUE, rather than deleting the whole statement and creating a new one. Useful for dealing with literal values you want to reconcile. This is not the recommended use of NOVALUE statements and would not be suitable on Wikidata, but it fits this use case.
  • Don't "Author" and "Author Name" lead to the same result? Yes, but the literal value may not be the name of the entity. To avoid losing track of the original name form, it is kept as a qualifier on the reconciled author statement.
  • Presented at WikidataCon
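The placeholder-to-value update described above can be sketched as a transformation on the statement JSON that the Wikibase action API exchanges (fetched with action=wbgetclaims, written back with action=wbsetclaim). This is a minimal sketch, not the presenter's actual code; the property IDs used (P50 for the author item, P1932 for the original name string) are illustrative placeholders for whatever the local wikibase uses.

```python
import copy

def reconcile_novalue_claim(claim, reconciled_qid, name_qualifier_pid, original_name):
    """Turn a NOVALUE placeholder claim into a value claim pointing at the
    reconciled item, keeping the original literal name as a qualifier so the
    original name form is not lost.

    `claim` is a statement in Wikibase JSON form (as returned by
    action=wbgetclaims); the rewritten copy could be sent back with
    action=wbsetclaim.
    """
    if claim["mainsnak"]["snaktype"] != "novalue":
        raise ValueError("expected a NOVALUE placeholder claim")
    new = copy.deepcopy(claim)
    # replace the novalue snak with a value snak for the reconciled item
    new["mainsnak"]["snaktype"] = "value"
    new["mainsnak"]["datavalue"] = {
        "type": "wikibase-entityid",
        "value": {
            "entity-type": "item",
            "numeric-id": int(reconciled_qid.lstrip("Q")),
            "id": reconciled_qid,
        },
    }
    # keep the original literal name as a string qualifier
    qualifier = {
        "snaktype": "value",
        "property": name_qualifier_pid,
        "datatype": "string",
        "datavalue": {"type": "string", "value": original_name},
    }
    new.setdefault("qualifiers", {}).setdefault(name_qualifier_pid, []).append(qualifier)
    order = new.setdefault("qualifiers-order", [])
    if name_qualifier_pid not in order:
        order.append(name_qualifier_pid)
    return new

# hypothetical placeholder statement, as created before reconciliation
placeholder = {
    "type": "statement",
    "rank": "normal",
    "mainsnak": {"snaktype": "novalue", "property": "P50",
                 "datatype": "wikibase-item"},
}
fixed = reconcile_novalue_claim(placeholder, "Q42", "P1932", "Lindemann, D.")
```

Updating the statement in place this way preserves its statement ID, qualifiers, and rank, which is the point of editing rather than deleting and recreating it.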
CIDOC-CRM "Modelling linked data in Wikibase"

Slides:
  • When I worked with CIDOC-CRM, it was in the context of the linked data cloud, not as a thing on its own, relying on linked data and RDF.
  • Start out with an empty wikibase. There is an option to clone Wikidata's properties, but I avoid that as much as possible: both wikibases have a life of their own, so they diverge and go out of sync. Cloning is only useful if you want to stick to Wikidata's properties; otherwise copy them locally, or just the ones you need, into your own wikibase. This means your wikibase won't follow the same property numbers, so e.g. a local P3 becomes skos:exactMatch, used to say e.g. local Px == Wikidata P31.
  • There are a couple of approaches to property modelling: either mint a property like "birth date" (how it is normally done), or use generic properties such as "is a feature", "value", "feature kind" = "date of birth". The issue with the first is that you need to know lots of properties; an alternative is to use a catalog code.
  • We wanted to store CIDOC-CRM data in Wikibase, which means you can rely on definitions provided by an external party, like the ontologies in OLS. Can we reuse these in Wikibase and save time? There are constraints in the ontology, and getting it into Wikibase can be challenging, but it is possible to flatten it. This approach is known as "boxology": define the space of the model on the whiteboard first. A T-rex taxonomy was used as an example with an Entity Schema.
  • We wrote a bot to import the data (not complex, compared with representing the data that came from an Excel sheet and deciding between the two models of property modelling). The most important part is probably defining an Entity Schema; then either manually create the properties or, ideally, use a bot to import them from the Entity Schema.
  • If we could build "boxology" on top of Wikibase and create a kind of OpenRefine-style minting of models, that would be great. Rushing is not good.
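The property-alignment idea mentioned in the talk (a local property playing the role of skos:exactMatch, linking local properties to external vocabulary terms such as Wikidata's P31 or a CIDOC-CRM class) can be sketched as a simple lookup used when exporting or querying. All property IDs and IRIs below are illustrative assumptions, not taken from the presenter's wikibase.

```python
# Map local Wikibase property IDs to external ontology IRIs via an
# "exact match" alignment property, so exported RDF can use the external
# vocabulary instead of local P-numbers.

EXACT_MATCH_PID = "P3"  # hypothetical local property acting as skos:exactMatch

# alignment statements as (local property, alignment property, external IRI)
ALIGNMENTS = [
    ("P7", EXACT_MATCH_PID, "http://www.wikidata.org/entity/P31"),
    ("P9", EXACT_MATCH_PID, "http://www.cidoc-crm.org/cidoc-crm/E21_Person"),
]

def build_property_map(triples, match_pid):
    """Return {local property ID -> external IRI} from alignment triples."""
    return {s: o for s, p, o in triples if p == match_pid}

def externalize(predicate, mapping):
    """Translate a local predicate to its external IRI, if an alignment
    exists; otherwise keep the local ID unchanged."""
    return mapping.get(predicate, predicate)

mapping = build_property_map(ALIGNMENTS, EXACT_MATCH_PID)
```

In practice the alignment triples would come from a SPARQL query against the wikibase rather than a hard-coded list; the sketch only shows why diverging property numbers stop mattering once each local property carries its external match.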