Community Wishlist Survey 2020/Archive/Display recordings with an order based on geolocalization

From Meta, a Wikimedia project coordination wiki

Display recordings with an order based on geolocalization

✗ Can't be implemented at this moment

  • Problem: Thanks to the recording tool Lingua Libre, Wiktionaries can now display dozens of audio files with pronunciations for each word. It is a great way to hear the diversity of usages and the reality of languages. Unfortunately, as time goes on, this becomes problematic: the pronunciation section inflates, gets chaotic, and is not adapted to readers. The order the recordings arrive in is not useful, and correcting the order by hand is not possible given the quantity of files added. Moreover, a single order does not fit the diversity of the audience.
  • Who would benefit: Readers of Wiktionaries all over the world
  • Proposed solution: Gathering all audio files in one template, together with a solid script, may allow a display adapted to the geolocation of users, based on their IP address. That way, someone from Canada would get North American audio first, then European and African; someone from Mali would get Malian audio first, then other African audio, then European, and so on. Audio files recorded with Lingua Libre have metadata including a country or a city, so it could be possible to relate those data to a geographical database such as GeoNames, or to Wikidata itself, in order to place the recordings on a map. The display could then order the files by their distance from the user. This requires authorization to use the IP address, but it could be only the first digits, so that it need not be considered personal data. It could work without any cookies and be session-dependent. A v2 could allow specifying a different location as a preference.
  • More comments: Some examples of pages with too many audio files: cinq, arbre, fraise.
  • Phabricator tickets:
  • Proposer: Noé (talk) 08:15, 22 October 2019 (UTC)[reply]
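The ordering step described in the proposal can be sketched as follows. This is a minimal illustration, not an actual implementation for Lingua Libre or Wiktionary: the file names and coordinates are made up, and a real version would read the coordinates from the files' metadata and the user's coarse IP geolocation.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def order_recordings(recordings, user_lat, user_lon):
    """Sort recordings nearest-first relative to the user's (coarse) location."""
    return sorted(
        recordings,
        key=lambda r: haversine_km(user_lat, user_lon, r["lat"], r["lon"]),
    )

# Hypothetical recordings of "cinq", with coordinates taken from each
# file's country/city metadata (values here are approximate and invented).
recordings = [
    {"file": "LL-fr-cinq-paris.wav", "lat": 48.85, "lon": 2.35},    # France
    {"file": "LL-fr-cinq-bamako.wav", "lat": 12.64, "lon": -8.00},  # Mali
    {"file": "LL-fr-cinq-quebec.wav", "lat": 46.81, "lon": -71.21}, # Canada
]

# A reader geolocated to Montreal gets the Canadian recording first,
# then the European one, then the Malian one.
for rec in order_recordings(recordings, 45.50, -73.57):
    print(rec["file"])
```

Truncating the IP to its first digits, as the proposal suggests, simply coarsens `user_lat`/`user_lon`; the ordering logic is unchanged.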

Discussion

  • Working on the display of recordings may lead to adding a way to record directly in Wiktionaries via a Lingua Libre account (to add metadata about the speaker). There is another proposal that suggests doing so with a different solution, but both could be connected, maybe 🙂 Noé (talk) 10:05, 13 November 2019 (UTC)[reply]

Unfortunately, this type of feature would be hard to do in a wikitext-based dictionary; however, Wikidata support for Wiktionary is being worked on as we speak, so hopefully in the future this will be more easily doable. But for now, we can't accept this proposal. Regardless, thank you for participating in our survey! MaxSem (WMF) (talk) 06:35, 20 November 2019 (UTC)[reply]

Hi MaxSem, you wrote that "Wikidata support for Wiktionary is being worked on", and a reference is needed. To my knowledge, Wikidata has its own agenda, and it doesn't include support for Wiktionary. Wikidata chose to build a separate registry of words in the Lexeme namespace. Maybe Wikibase could be of some help, but that is a separate initiative, and there is nothing related to this need in the community wishlist. Noé (talk) 06:56, 20 November 2019 (UTC)[reply]
(edit: I realize I'm partially off-topic, since the request is not about building maps; I hope that some of what I wrote is still interesting.) I think that the solution can be found in the new developments made on Structured Data for Commons. If the recording files on Commons include the right metadata, then, since it will soon be possible to query the structured data from Commons in SPARQL (a beta service is already available), just like it is already the case for Wikidata's data (example), it will be possible to create dynamic maps. From there, including the map on a wiki page will be a matter of building the right template (the Basque Wikipedia is already integrating dynamic SPARQL maps; unfortunately I can't show you an example because the Graph extension is broken at the moment, but here you can see the template).
So I think you could find a solution that doesn't involve a big development task, but it will need a bit of creative hacking and discovering the joys of SPARQL and Lua :) Lea Lacroix (WMDE) (talk) 17:19, 26 November 2019 (UTC)[reply]
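As a rough sketch of the SPARQL-plus-scripting pipeline suggested above: the query below uses Wikidata Query Service syntax with the real property P625 ("coordinate location") to fetch one coordinate per country, and the parsing function turns a SPARQL JSON response into (lat, lon) pairs that a template or Lua module could then use for distance ordering. The sample response is hand-written for illustration (the coordinate values are approximate); real endpoint results use the same `results.bindings` structure.

```python
import json

# Illustrative SPARQL query (Wikidata Query Service syntax): fetch the
# coordinate location (P625) of each country found in recording metadata.
# Q142 = France, Q912 = Mali, Q16 = Canada.
QUERY = """
SELECT ?country ?coord WHERE {
  VALUES ?country { wd:Q142 wd:Q912 wd:Q16 }
  ?country wdt:P625 ?coord .
}
"""

# Hand-written sample of the JSON a SPARQL endpoint returns (truncated;
# the coordinate values here are approximate, for illustration only).
SAMPLE_RESPONSE = json.dumps({
    "results": {"bindings": [
        {"country": {"value": "http://www.wikidata.org/entity/Q142"},
         "coord": {"value": "Point(2.0 47.0)"}},
        {"country": {"value": "http://www.wikidata.org/entity/Q912"},
         "coord": {"value": "Point(-4.0 17.0)"}},
    ]}
})

def parse_coords(response_text):
    """Map each country QID to a (lat, lon) pair.

    SPARQL coordinate literals use WKT 'Point(lon lat)' order, so the
    two numbers are swapped when building the (lat, lon) pair.
    """
    out = {}
    for binding in json.loads(response_text)["results"]["bindings"]:
        qid = binding["country"]["value"].rsplit("/", 1)[-1]
        lon, lat = binding["coord"]["value"].strip("Point()").split()
        out[qid] = (float(lat), float(lon))
    return out

print(parse_coords(SAMPLE_RESPONSE))
```

In a live setup the query would go to the (beta) Commons/Wikidata SPARQL endpoint over HTTP instead of using a canned response, and the resulting coordinates would feed the distance-based ordering on the wiki page.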
Thank you, Léa! Structured data for Commons + SPARQL + Lua = ♡ -- Noé (talk) 17:30, 26 November 2019 (UTC)[reply]
Very interesting idea. I created T239272 to track this feature. Pamputt (talk) 19:39, 26 November 2019 (UTC)[reply]