Community Wishlist Survey 2020/Wiktionary/More Lua memory for Wiktionary

From Meta, a Wikimedia project coordination wiki

The survey has concluded. Here are the results!


  • Problem: Lack of Lua memory for basic words. See wikt:CAT:E for currently affected words.
  • Who would benefit: All users and readers of Wiktionary.
  • Proposed solution: More Lua memory for Wiktionary.
  • More comments:
    • Pages that run out of memory are not being categorized properly, and their sort keys do not work.
    • Standard information, such as citations and semantically related terms, is being removed as a temporary workaround, which is a loss of information for our readers.
  • Phabricator tickets: phab:T188492
  • Proposer: KevinUp (talk) 16:20, 11 November 2019 (UTC)

Discussion

Alternatively, consider implementing a tool so that the source of each language can have its own separate page, just as each of these proposals has its own individual page. KevinUp (talk) 16:20, 11 November 2019 (UTC)

The most parsimonious solution is to raise the cap on Lua memory. This cap seems arbitrarily placed, and it is crippling for long pages. Metaknowledge (talk) 18:04, 11 November 2019 (UTC)
Indeed. If this were run like a company, we would already have more memory, because the time spent circumventing the cap costs more than the trifling amount of additional memory that would actually be used (it concerns only a few dozen pages, over which much dust has been raised). Fay Freak (talk) 21:52, 15 November 2019 (UTC)
@Noé: Hi. Does the French Wiktionary have the same issue (lack of Lua memory) for entries with short words? I think if we were to migrate information from English Wiktionary to French Wiktionary, the same situation would occur. KevinUp (talk) 23:32, 16 November 2019 (UTC)
Local community discussion regarding the lack of Lua memory can be found here, here and here. If only the Community Tech team would be gracious enough to tell us how much memory is actually needed by wikt:do, wikt:一, wikt:人, wikt:水, wikt:月, wikt:生, wikt:我, which are basic words. KevinUp (talk) 20:15, 18 November 2019 (UTC)
@Pamputt: Hi. Does the French Wiktionary currently have issues with lack of Lua memory? Do you think the same issue would occur if information from English Wiktionary for entries such as wikt:do, wikt:一, wikt:人, wikt:水 were copied to the French Wiktionary? KevinUp (talk) 11:56, 19 November 2019 (UTC)
@KevinUp: Actually, we do not use Lua modules that much on the French Wiktionary, so I am not aware of such limitations. Yet maybe JackPotte or Darkdadaah can say more. Pamputt (talk) 18:50, 19 November 2019 (UTC)
We do use Lua modules quite a bit, although not as much as the English Wiktionary. We don't have as much metadata (languages in particular) or automated content, so I believe we are not yet at the point where memory is an issue. Our main issue at the moment, I believe, may be pages with lots of translations, such as this demonstration page, which purports to list the word for "water" in every language: wikt:fr:Utilisateur:Pamputt/eau. In that case, though, the limit is execution time (>10 s).
As for the memory issue, it would be nice to have an idea of what takes so much memory, so that we can make an informed decision about how much will be needed. Darkdadaah (talk) 16:12, 21 November 2019 (UTC)
Thanks for the reply. For translations, some entries on the English Wiktionary use wikt:Category:Translation subpages to offload content and reduce Lua memory use. Yes, it would be nice if we knew how much memory is actually needed by the pages in wikt:Category:Pages with module errors. KevinUp (talk) 07:15, 22 November 2019 (UTC)
@Lo Ximiendo: Does the Chinese Wiktionary use a lot of Lua memory as well? I noticed that it uses similar modules from English Wiktionary. KevinUp (talk) 05:13, 30 November 2019 (UTC)
@KevinUp: All I know is that the Lua modules on the Chinese Wiktionary were imported from the English Wiktionary, with modifications made, of course. --Lo Ximiendo (talk) 05:29, 30 November 2019 (UTC)
There's also a phabricator task for better memory profiling support, which would allow us to do targeted optimizations instead of blind guesswork. – Jberkel (talk) 08:28, 2 December 2019 (UTC)

Voting

  • Support as proposer. KevinUp (talk) 03:45, 21 November 2019 (UTC)
  • Support Metaknowledge (talk) 05:32, 21 November 2019 (UTC)
  • Support Mellohi! (talk) 14:05, 21 November 2019 (UTC)
  • Support Smashhoof (talk) 19:36, 21 November 2019 (UTC)
  • Support Pamputt (talk) 21:29, 21 November 2019 (UTC)
  • Support Sadads (talk) 21:44, 21 November 2019 (UTC)
  • Support Libcub (talk) 08:56, 22 November 2019 (UTC)
  • Support DCDuring (talk) 03:54, 23 November 2019 (UTC)
  • Support Andrew Sheedy (talk) 04:09, 23 November 2019 (UTC)
  • Support --Lambiam 09:33, 23 November 2019 (UTC)
  • Support I'd like to see Lua experts work closely with the enwiktionary community to resolve this issue. There will be flow-on benefits for all Wiktionary versions. This, that and the other (talk) 01:44, 24 November 2019 (UTC)
  • Support – Mnemosientje (t · c) 12:13, 25 November 2019 (UTC)
  • Support JogiAsad (talk) 12:59, 25 November 2019 (UTC)
  • Support DemonDays64 (talk) 14:42, 25 November 2019 (UTC)
  • Support Liuxinyu970226 (talk) 15:50, 25 November 2019 (UTC)
  • Support 17:09, 25 November 2019 (UTC)
  • Support, even though I believe this will not be enough in the long term. The long-term solution, IMO, is to get translations and other interrelations between words from Wikidata Lexemes, where there are no such limits. So9q (talk) 18:47, 25 November 2019 (UTC)
    @So9q: There may be other limits with Wikidata, especially when it is queried. Pamputt (talk) 06:47, 26 November 2019 (UTC)
    @Pamputt: Do you mean timeouts? That is a problem, but only for very large collections (multiple millions) of humans, or with lots of filtering on something that does not have P31. None of our current lexeme queries ever times out, or even comes close; that might change, of course, as the number of lexemes grows. --So9q (talk) 09:19, 26 November 2019 (UTC)
  • Support Ilawa-Kataka (talk) 19:11, 25 November 2019 (UTC)
  • Support Vorziblix (talk) 05:15, 26 November 2019 (UTC)
  • Support Thibaut120094 (talk) 16:43, 26 November 2019 (UTC)
  • Support Erutuon (talk) 21:16, 26 November 2019 (UTC)
  • Support Gce (talk) 21:28, 27 November 2019 (UTC)
  • Support Curious (talk) 22:20, 29 November 2019 (UTC)
  • Support Canonicalization (talk) 10:08, 30 November 2019 (UTC)
  • Support 94rain Talk 12:55, 30 November 2019 (UTC)
  • Support – Tom 144 (talk) 16:37, 30 November 2019 (UTC)
  • Support AryamanA (talk) 17:28, 30 November 2019 (UTC)
  • Support Justinrleung (talk) 18:58, 30 November 2019 (UTC)
  • Support Equinox (talk) 10:44, 1 December 2019 (UTC)
  • Support Jberkel (talk) 23:26, 1 December 2019 (UTC)
  • Support This seems rather easy to fix, but still +1. Sannita - not just another it.wiki sysop 13:08, 2 December 2019 (UTC)
  • Support - TheDaveRoss (talk) 15:35, 2 December 2019 (UTC)
  • Support Novak Watchmen (talk) 17:48, 2 December 2019 (UTC)