Language committee/Archives/2007-10

From Meta, a Wikimedia project coordination wiki
For a summary of discussions, see the archives index.

Spanned discussions[edit]

The following discussions span multiple months and are archived in the first applicable archive:

Irregular conditional approvals[edit]

No consensus was reached regarding conditional approvals made without discussion, notification, or explanation.

  1. Jesse Plamondon-Willard (Pathoschild)
    01 October 2007 02:48

    Hello,

    GerardM conditionally approved the Bikol Wikipedia last month <http://meta.wikimedia.org/wiki/Requests_for_new_languages/Wikipedia_Bikol?diff=657005>. I can't find any relevant discussion in the archives; where was this decision made? The page also does not specify the remaining conditions to be met, which is very important to explain when the subcommittee conditionally approves a request.

  2. Jesse Plamondon-Willard (Pathoschild)
    01 October 2007 03:03

    The request for an Elfdalian Wikipedia was also rejected a few weeks ago <http://meta.wikimedia.org/wiki/Requests_for_new_languages/Wikipedia_Elfdalian?diff=669702>. I can't find any relevant discussion for this rejection either. Please don't close requests without at least notifying the rest of the subcommittee on the mailing list.

    There are a number of procedural oversights in the rejection as well (the comment is misplaced, the request isn't listed as rejected, and it was not removed from the main page); fixing these is another reason for notifying the rest of the subcommittee.

  3. Jesse Plamondon-Willard (Pathoschild)
    01 October 2007 03:08

    The requests for Erzya and Gan Wikipedias were also closed without the important explanatory comment, and the Erzya Wikipedia apparently with neither discussion nor notification.
    <http://meta.wikimedia.org/wiki/Requests_for_new_languages/Wikipedia_Erzya_3?diff=645397>
    <http://meta.wikimedia.org/wiki/Requests_for_new_languages/Wikipedia_Gan?diff=645400>

  4. Jesse Plamondon-Willard (Pathoschild)
    01 October 2007 03:16

    Hiligaynon Wikipedia closed without discussion or notification, and with no comment explaining the remaining criteria for approval: <http://meta.wikimedia.org/wiki/Requests_for_new_languages/Wikipedia_Hiligaynon_2?diff=645403>

    Sassarese Wikipedia closed with no explanation of the remaining criteria for approval: <http://meta.wikimedia.org/wiki/Requests_for_new_languages/Wikipedia_Sassarese?diff=669811>

  5. Jesse Plamondon-Willard (Pathoschild)
    01 October 2007 03:21

    No explanatory comment:
    <http://meta.wikimedia.org/wiki/Requests_for_new_languages/Wikipedia_Ingush?diff=645404>

    Despite appearances, I'm going through recently closed requests (not GerardM's contributions).

  6. Jesse Plamondon-Willard (Pathoschild)
    01 October 2007 03:46

    No comments:
    <http://meta.wikimedia.org/wiki/Requests_for_new_languages/Wikipedia_Pökoot?diff=657163>
    <http://meta.wikimedia.org/wiki/Requests_for_new_languages/Wikipedia_Tachelhit?diff=645422>
    <http://meta.wikimedia.org/wiki/Requests_for_new_languages/Wikipedia_Tarifit?diff=645421>

    Thanks to Shanel for finding many of these incomplete closures.

  7. Gerard Meijssen (GerardM)
    01 October 2007 11:23

    <this user has not agreed to public archival.>

  8. Gerard Meijssen (GerardM)
    01 October 2007 11:24

    <this user has not agreed to public archival.>

  9. Gerard Meijssen (GerardM)
    01 October 2007 11:27

    <this user has not agreed to public archival.>

  10. Michal Zlatkovský (Timichal)
    01 October 2007 13:59

    My understanding is different: they can start in the incubator anytime, conditional approval just encourages them to do so and means they haven't met all the specified conditions yet.

    I've always opposed, and will continue to oppose, the idea of langcom deciding who can or can't start a test on the incubator.

    On 10/1/07, GerardM wrote:
    <this text is quoted from a user who has not agreed to public archival.>

  11. Gerard Meijssen (GerardM)
    01 October 2007 14:07

    <this user has not agreed to public archival.>

  12. Bèrto 'd Sèra
    01 October 2007 14:17

    Hi,

    In any case, this is just an idea. I'm not aware of it being an applied rule; I wish it were, but I'm not aware that it is. "Conditional approval" at the moment actually means that LangCom has no reason to say the request will be refused if all other parameters required by the policy are met.

  13. Gerard Meijssen (GerardM)
    01 October 2007 14:33

    <this user has not agreed to public archival.>

  14. Jesse Plamondon-Willard (Pathoschild)
    01 October 2007 21:59

    Hello,

    Conditional approval is a statement that the subcommittee will approve that request once *specific* requirements are met, if no exceptional problems arise. This should only be given by subcommittee decision, just like rejection and final approval. If you think a request should be conditionally approved, mention it on the mailing list and do so in a few days if nobody objects.

    The policy itself states this: 'If discussion and past experience indicates that the project is a good idea and would prosper, the language subcommittee will conditionally approve the language (with the "{{ls-header|conditional|comment ~~~~}}" template). The conditions for final approval will be explained in the header.'

    When we do conditionally approve a request, we should also follow up and help them achieve full approval. This includes explicitly stating what remains to be done, and updating that statement as they progress. If they are localizing, there is a template on <http://meta.wikimedia.org/wiki/Special_projects_subcommittees/Languages/Handbook> to put on the talk page that explains how to do so, which messages are optional, and how to contact a responsive member of the subcommittee if they have any questions or suggestions.

    If you don't have the time to guide communities through the process, then simply propose them for conditional approval on the mailing list and I or another member will do so.


  15. 02 October 2007 19:40

    <this user has not agreed to public archival.>

  16. Gerard Meijssen (GerardM)
    02 October 2007 22:10

    <this user has not agreed to public archival.>


  17. 03 October 2007 00:21

    <this user has not agreed to public archival.>

  18. Gerard Meijssen (GerardM)
    03 October 2007 20:35

    <this user has not agreed to public archival.>

Wikiquote Limburgish, Wikisource Armenian & Volapük, Wikiversity Greek[edit]

The requests for an Armenian Wikisource & Volapük Wikisource were conditionally approved, and no action was taken on the requests for a Limburgish Wikiquote & Greek Wikiversity.

  1. Shanel Kalicharan
    01 October 2007 04:34

    Hello all,

    I propose the conditional approval of the following requests:

    ==Limburgish Wikiquote==
    The language is suitable, there is a large interested community, and the test project is active. The only step that remains is completion of the translation of the basic user interface.

    ==Armenian Wikisource==
    The test project on Old Wikisource has been doing very well, and there is an interested community. All that is left to do is continue translating the interface.

    ==Volapük Wikisource==
    While there is only one major contributor to the test project, he has been quite active and dedicated, adding around 550 texts. I am sure he will be able to find other contributors and to translate the interface as needed. In addition, he is working with a corpus of only 2000-3000 texts, so it is well within the realm of possibility for him to do it alone or with only a few other contributors.

    ==Greek Wikiversity==
    The language is suitable, and there is a large interested community. However, there is a bit of the interface left to translate, and the test project does not appear to be very active yet.


    I also rejected the request for an Elven Wikipedia.

  2. Bèrto 'd Sèra
    04 October 2007 08:11

    Ok for all 5 :-)

  3. Gerard Meijssen (GerardM)
    04 October 2007 08:43

    <this user has not agreed to public archival.>

Revoke conditional approval of Tachelhit Wikipedia[edit]

No action was taken on the request for a Tachelhit Wikipedia.

  1. Jesse Plamondon-Willard (Pathoschild)
    04 October 2007 00:35

    Hello,

    I propose revoking the conditional approval of the Tachelhit Wikipedia <http://meta.wikimedia.org/wiki/Requests_for_new_languages/Wikipedia_Tachelhit>.

    GerardM conditionally approved it without subcommittee discussion in early August, but it fails nearly every requirement except the ISO code.

    The user has asked individual subcommittee members how to proceed now that it is conditionally approved, since GerardM did not include the required comment explaining what remains to do (which is everything).

    Conditional approval requires subcommittee consensus. It is not given automatically to all languages with ISO codes <http://meta.wikimedia.org/wiki/WM:LPP#Requisites> (rather, those without should be rejected by subcommittee consensus).

  2. Bèrto 'd Sèra
    04 October 2007 02:10

    Hi!

    Those without an ISO code are rejected automatically. That has always happened; it's a standard and it doesn't require any discussion, although it's nice to notify LangCom when doing it. So far, conditional approval just verifies that an ISO code is present, apart from side cases such as historical dead languages, which fall under a different policy (and have mostly been rejected even when they had a code). Once a code is there and the language is a living language, the light is always green for pre-approval; this doesn't need discussion either (again, it's better to notify LangCom for the record anyway).

    We work by precedents, to avoid unnecessary overload and most notably to avoid confusion. The policy must be clear and consistent with itself, unless we want to grant specific preferences (which are unacceptable in principle). Our work is pretty much automatic, as far as pre-approval is concerned.

    So I can't see on what grounds we'd revoke their pre-approval, since nobody took away their code. We always said there is NO deadline for a project to evolve into a wiki, so we cannot now judge people as not pre-approved just because they show low activity. We already have cases (say, Sakha) in which editors cut and paste untranslated content from other wikis in order to appear 'productive' (and hope we cannot tell Russian from actual Sakha). I can't see why we should encourage this behavior by penalizing people for being 'slow'.

    Let them/him work. They have a legal code; if they end up forming a community, they will get their wiki; if not, it will rest in peace. We don't expect those few million native speakers in Morocco to all be connected to the net or to be galvanized in minutes, do we? A couple of months is too early to judge. Give them their chance: from an IT point of view, Morocco is NOT New York City, and their situation is objectively difficult.

  3. Jesse Plamondon-Willard (Pathoschild)
    04 October 2007 02:43

    Bèrto,

    Conditional approval is not the "pre-approval" you describe. The policy at <http://meta.wikimedia.org/wiki/WM:LPP> explains what conditional approval is and what the specific requirements are to attain it.

    Conditionally approving a request acknowledges that all the basic criteria are satisfied, and that they can safely work on the criteria for final approval without worrying that they'll be rejected in the end. It is *not* required to work on a request, and should not be given if the basic criteria have not been met.

    Having an ISO code is not the sole requirement. In particular, "The language must be sufficiently unique that it could not coexist on a more general wiki. In most cases, this excludes regional dialects and different written forms of the same language." This means that, even if there was an ISO code for British English, editors would nonetheless be required to collaborate on the mixed English Wikipedia.

    I am certain that, of these conditionally approved cases, there was no careful examination of the requested languages, no note taken of what they had already done so we can keep track of their progress and guide them, and no explanation given concerning remaining tasks. These are all absolutely crucial tasks of the subcommittee.

    That is why conditional approval should never be given without prior subcommittee discussion, and *certainly* not secretly with no notification at all. You worry about 'making specific preferences', but that is exactly what we do when we conditionally approve a project that does not qualify for conditional approval.

  4. Bèrto 'd Sèra
    04 October 2007 03:40

    You read a policy nobody is using here; I look at consistency with what we have done so far. We have delegated ALL decision-making power to ISO, and I see no reason why we should give lawmaker functions to our Royal selves. We are *not* gods. We can guard the application of a well-respected international standard, but we are not supposed to *write* such a standard. There are much better heads than ours doing that.

    Policy should absolutely avoid any subjective decision at the conditional approval stage. WHO will decide if a language is *sufficiently different*? Based on what? How much is *sufficient*? Can you name a scientific unit of measure for *uniqueness* and state a definite threshold beyond which it's *enough*? Can you name a vendor selling *uniqueness scales* and certifying their correct functioning?

    This is ridiculous. The best specialists on this planet come out giving somebody an independent code, but no, four drunken idiots in LangCom make their *own* standard, because "they know better"... pleeeease... let's keep our feet on the ground; it's where they are supposed to be.

    "In most cases, this excludes regional dialects and different written forms of the same language" *In most cases* is NOT a definite statement, but it's rather a wonderful declaration of unrestricted arbitrary power. Our Imperial Majesties decide which cases, based on our Royal Likings. Too bad nobody ever asked us to wear a crown.

    It's time we DO approve a *clear* policy, something people can read and understand. Not just political mumbo-jumbo written for the sake of nicely asserting that "we'll do whatever the heck we please as long as we can fill in TONS of paper you morons won't be able to read in a lifetime". We need to limit LangCom's powers to a realistic extent and lay the grounds for a transparent process (which HAS happened thus far, but is impossible with the 'policy' you quote).

    So, I put this to an official LangCom vote:

    The policy gets reformulated as follows: 1) requests receive automatic conditional approval if they concern a living human language AND have a valid ISO 639-3 code;
    2) requests missing an ISO 639-3 code are refused without discussion; they can be resubmitted as soon as a proper ISO 639-3 code is issued by the ISO authority;
    3) all cases not fitting points 1) and 2) require explicit discussion and a vote. Once a request for a vote is made, a term of 14 solar days applies (holidays included), after which votes are counted. There is no minimum quorum; the decision is taken by simple majority. In the event of a tie, the decision is for conditional approval to be granted.

    I will count the votes after 14 days, obviously, unless we all cast an explicit vote before that deadline. If nobody votes, my vote wins, so take the time to think about it and express an opinion, especially if you don't like it.

  5. Shanel Kalicharan
    04 October 2007 19:40

    Hello,

    I vote against this proposal. I won't give any specific reasons since everyone else has already covered them. I'm not saying we shouldn't tweak the policy, but this is not the way to go about it.

    For everyone wondering when conditional approval came into existence: you all approved the Pathoschild-GerardM draft of the policy. Unless you were all asleep or not paying attention, you all agreed to it. If you wish to reformulate the policy so that the only criterion is being a living language with an ISO 639-3 code, then there will be no need for this subcommittee to exist. It is a task so simple that it does not require any decision-making body at all.

    Since this discussion was born of a unilateral action, perhaps now would be a good time to bring up something that I think needs to be said. While we are a subcommittee that is supposed to work together, it is clear that we do not. Inactivity due to real-life commitments is understandable and forgivable for volunteers, but the fact is that most of the work is done by one person, Jesse. Even when he left the subcommittee, he was always responding to questions that *we* were supposed to answer, and looking after the upkeep of Requests for new languages. Yes, my judgment is probably clouded by the fact that he is my boyfriend, but objectively speaking, Jesse has probably done more work than all of us combined.

    When we do not work together, how can we ever hope to be productive? If we do things without informing each other, how can we assist the users who need our help? If people have the impression that we are sloppy, lazy, and incompetent, then I do not blame them.

  6. Gerard Meijssen (GerardM)
    04 October 2007 19:50

    <this user has not agreed to public archival.>

  7. Bèrto 'd Sèra
    04 October 2007 07:25

    Jesse, let's say it clearly: if there were an ISO 639-3 code for British English, the Brits WOULD get their wiki. English is NOT different from any other language on earth. We have no noble families and no superior Aryan races here.

    If LangCom made an exception for English, you have my word that I would put ALL previously approved requests back under review, to make sure that ALL other languages on earth get as much as English got, no matter the consequences.

    I'm sorry if this offends your native English speaker's sensitivity, but you are not any better than anyone else on earth. Racism is not acceptable, no matter how nicely you phrase it. All linguistic codes have the SAME RIGHTS: what is granted to one gets automatically granted to all.

    If you don't want a British wiki make damn sure ISO doesn't give out a British code. But that's a problem you'll discuss with ISO, not with us, because we do NOT control ISO. If you'd rather be a racist, that's your problem, but I can't see how blatant racism and the WMF mission can go under the same roof.

  8. Sabine Cretella
    04 October 2007 10:53

    Please remember that German, Swiss and Austrian German also live on one Wikipedia. Why not create, as small Wikipedias have to do, pages that reflect the different versions of English (spelling, sentence structure, etc.)? I mean, Multilingual MediaWiki will allow for that (and yes, it is only a matter of time until we get it), but even if you want to do it now, you can: there are wikis that handle this quite well.

    We cannot make any exception to the rule, otherwise whatever we do will be put into question.

  9. Bèrto 'd Sèra
    04 October 2007 11:06

    Honestly, I don't think there will EVER be separate codes for English and German flavours. ISO itself will never want to create such a problem, which would not be local to the WMF alone: the whole world would be in trouble.

    The problem for us is one of principle: we cannot make a list of class A and class B languages. This is absolutely absurd. The very same day you make this list, you will get back the request from the Karelians, who wanted all three of their languages in one single wiki and were told NO.

    We also said NO in a situation in which it was far more difficult for the Karelian people to make a wiki out of 8k native speakers, when in fact two en.wikis would have more than enough of an audience to survive.

    Such things cannot be solved by saying that big sellers (en/de etc.) have more rights. We either address them with a policy that fits them all (as MLMW would) or keep quiet and simply pray that ISO doesn't issue an ENG-UK code.

    But the solution CANNOT be based on racism, because that is how any other language community would perceive it, and they would be within their rights to think so. We have time; no eng-uk code will be released for years, but if we want a policy that will allow grouping two codes into a single wiki, we have to think of a solution that will fit ALL.

  10. Gerard Meijssen (GerardM)
    04 October 2007 12:02

    <this user has not agreed to public archival.>

  11. Sabine Cretella
    04 October 2007 12:13

    Hmmm... it seems you did not understand me well: what I said was that there should be no separate wikis (that is, there should be only one wiki for German, for example), but if the wikis want to create pages of the same article in various spellings/scripts etc., they can do it (we already have such wikis). There is nothing against that. What is not feasible is creating one Wikipedia for German German, one for Austrian German and one for Swiss German ...

    For me there are no class A and class B languages, but really only language entities ... since the term language itself always creates problems.

    When we talk about the German or English varieties: they are not different languages, but different scripts, right? So there are not three different languages on one wiki, but one language with different scripts.

    en-GB: centre
    en-US: center

    If Karelian is a language and it has three varieties (e.g. one using Latin script, another using Cyrillic script, or just some different spellings), there is no reason why they should not be able to live on one wiki. If we are talking about completely different languages, then things are different, and these different languages should ask for an ISO code. And as far as I remember, we are talking about different languages here.

    But from a quality point of view, showing people that a certain article is in British English and another in American English (and we should not forget Australia and New Zealand) is a help for readers, in particular if they are non-natives.

    Uhmmm ... if we do not allow for that we have problems also with Chinese ...

    We should take this up again when we have Multilingual MediaWiki around, because that will change quite a lot.

    We have situations in some languages where one way of writing is imposed (nds, for example, unless it has changed lately), but there is no "official way of writing" nds, so all should be allowed. It would therefore make sense to be able to say: this page is in the orthography following Sass, and that one is in the orthography following whoever. According to the Institut für Niederdeutsche Sprache, nds has 200 major varieties; if you go to the "middle" level you have 400 varieties, and if you dig really deep ... well, I don't have a clue. By not allowing different ways of writing, the Wikipedia "leaders" like we have/had for nds can make sure that the other varieties are excluded ... and that should not be allowed from a WMF point of view, since it is not neutral at all.

    We will get these kinds of situations more and more often, and therefore we should consider them; MLMW is probably the solution for this.

  12. Jesse Plamondon-Willard (Pathoschild)
    04 October 2007 19:07

    Bèrto,

    You're contradicting yourself; first you declare that any project should automatically be conditionally approved if they have an ISO code; then (in another thread) you say that, no, we should "discuss those that are not related to living human languages". First you say we should be consistent and fair and transparent and all things wonderful, then you say we should work by precedent and make special cases for non-"living" languages.

    How do we, apparently a bunch of bureaucratic megalomaniac drunks (though I'll let you speak for yourself), decide when a language is "living"? Does that include a modern language spoken by one person, a hundred speaking a revived historical language, or a thousand speaking a fictional language like Klingon, or a million speaking a constructed language like Esperanto? How do we decide which non-"living" languages get a wiki and which don't, if we discuss those in particular as you suggest?

    Applying a policy we all agreed on (or didn't bother to comment on, in your case) is not "racist", particularly since dialect is not a racial issue. Wanting to have one wiki per language (rather than dividing labour among countless dialectal wikis) is not a negative thing. You yourself admit to discriminating against non-"living" languages, so is that some form of "racism"? Please discuss rationally, rather than throwing empty epithets around.

    If we decide (by consensus) to change the policy, then so be it. Applying the policy we agreed on does not make me "racist", but ignoring the policy (as GerardM has done) is the very arbitrariness you constantly rant against.

    I'll start a new thread to discuss your productive point (changing the policy), so you can continue politicking and slinging your rhetorical mud separately.

  13. Gerard Meijssen (GerardM)
    04 October 2007 19:37

    <this user has not agreed to public archival.>

  14. Jesse Plamondon-Willard (Pathoschild)
    04 October 2007 19:52

    Hello,

    I apologize if I responded heatedly; I was rather antagonized by Bèrto's various accusations. My primary argument against conditionally approving this request is that it does not meet the requirements for conditional approval. If we change those requirements (rather than make an exception for this request), I'll have no objection.

    I think it is important that, as much as possible, every decision be objective and in line with the policy we dictate. There may be times when we need to make an exception outside the policy, but those should be rare exceptions rather than commonplace procedure. (Commonplace procedure should be added to the policy.)

  15. Gerard Meijssen (GerardM)
    04 October 2007 22:33

    <this user has not agreed to public archival.>

  16. Jesse Plamondon-Willard (Pathoschild)
    04 October 2007 22:56

    Gerard,

    By "commonplace procedure" I mean any way the subcommittee routinely does something (like conditionally approve a request). An example would be proposing a decision on the mailing list, waiting a specified period, and then implementing it. If this violates our own policy, there's obviously something wrong.

    The way you conditionally approve requests routinely violates our own policy, which is not acceptable. There are two solutions: [1] change the way you process requests to conform to the policy, or [2] change the policy to permit the way you process requests.

    I would personally favour a combination of the two. On the one hand, I think it's a *very* bad idea to change the status of requests in the subcommittee's name without, at the very least, notifying the subcommittee that you did so. On the other hand, the policy was drafted before we had processed a single request; now that we've processed a few, we can use experience and precedent to improve the policy.

    If you have any suggestions for simplifying or streamlining the policy, or removing unneeded bureaucracy, please post them in the other thread ("policy change").

  17. Gerard Meijssen (GerardM)
    05 October 2007 09:05

    <this user has not agreed to public archival.>

  18. Jesse Plamondon-Willard (Pathoschild)
    05 October 2007 13:42

    Hello,

    First, I oppose changing requests' status in the subcommittee's name without at least notifying the rest of us. You don't run the subcommittee, and your decisions are not absolute and above review. Even if they were, keeping us updated on what you're doing in our name is simple courtesy.

    Second, I oppose conditional approval without discussion or notification because this project is not ready. There are several much more developed requests that have been clamouring for recognition, and while they wait you go around conditionally approving projects that have nothing to their name except an ISO code.

    Third, as I've said above, this blatantly violates our own policy. We've (or at least I have) been trying to build a reputation for fairness and transparency, so that users can read the policy and know what to expect and what they need to do. You undermine this reputation when you openly violate the policy and give preferential treatment to projects who don't qualify for conditional approval according to the policy we say we use.

    Fourth, we frequently use the policy to avoid arguing; we say, "you need an ISO 639 code, it's the same rules for everyone, policy is policy." If we routinely make exceptions to the policy, why shouldn't they demand exceptions too?

    Fifth, aside from concerns about conditional approval, you simply don't do it correctly. These projects are never listed on the public log at <http://meta.wikimedia.org/wiki/RFL#Approved>, unless I do it. You don't note their current progress so we can easily track their status; next time we look at each project, we'll have to research it completely all over again to find out where it is now. You don't provide any explanation about the remaining tasks; when asked, you give a response like "do the localisation either [in the Incubator] or in BetaWiki", despite the fact that they likely know nothing about either project (not where they are, how to request translator rights, nor how to localize), and despite the fact that they don't even have an active test project yet (which should be done before we tell them to slog through endless hours of localization).

    The normal procedures to do the above are documented at <http://meta.wikimedia.org/wiki/Special_projects_subcommittees/Languages/Handbook>. We won't encourage development by being opaque or coldly bureaucratic. If you're unwilling to do all the steps, just propose the decisions on the mailing list and I'll find time to do them.

  19. Gerard Meijssen (GerardM)
    05 October 2007 14:32

    <this user has not agreed to public archival.>

  20. Jesse Plamondon-Willard (Pathoschild)
    05 October 2007 14:46

    Hello,

    Oh, never mind. I won't even bother arguing, I'll just fix your mess.

  21. Shanel Kalicharan
    05 October 2007 14:52

    Gerard,

    At the very least, it makes life harder for all of us when you don't tell the communities what they need to do for final approval, and when other people come running to us, wondering why we are being so unfair and biased.

  22. Gerard Meijssen (GerardM)
    05 October 2007 15:12

    <this user has not agreed to public archival.>

  23. Bèrto 'd Sèra
    05 October 2007 17:25

    LOLOLOLOLOLOL!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

    Are YOU telling me that all that has happened is that Jesse got annoyed because idiots like us forgot to fill in the expected blanks?????????

    Then why the hell don't you SIMPLY SAY SO, instead of calling in the Relativity Principle, Sacred Policies and God knows what else???????? I mean, I'm not sure we can use just ONE code for ENG after this… really not sure. We appear to be barely mutually intelligible here.

    Now, if THAT is the problem, then JESSE IS RIGHT, as far as I'm personally concerned, at the very least.

    Here's a concrete proposal. The whole Meta elephant was conceived when we had mob fights to decide which is which. What we are doing now is MUCH better suited to a ticketing system, with those gracious drop-down menus that, for everyone's information, are used even in Ukrainian supermarkets; the WMF appears to be the only place on earth where order processing is based on a wiki.

    We already have a ticketing system in the WMF; all it takes is to move the entire procedure there and make it self-documenting. Then morons like me won't forget to fill in the required menu entries, for the simple reason that they won't be able to commit the transaction until they do so. And Jesse won't have to waste hundreds of hours trying to understand which idiot forgot to write what, when and where.

    Does THIS make sense to us all? Now don't tell me that a ticketing system is wrong "for transparency", because if you do I'll ask you why we have been using one for ages for every other question/answer procedure we have. Besides, I do believe the ticketing system can mail us a link to open tickets, so we can really speed up the procedure, instead of producing tons of useless bureaucratic stuff and computing numbers with hand-held abacuses. We are supposed to be an IT-based project, remember?

  24. Bèrto 'd Sèra
    04 October 2007 02:35

    Hi,

    One more thing: it would be stupid to give excessive importance to the count of people who 'claim' to be native speakers. It would only lead requesters to invite all of their friends and their pets as 'native speakers', as used to happen when things were decided by public voting.

    It's better if we avoid criteria that only invite fake data. If they don't have native speakers they'll make a bad incubator project, and it will show pretty clearly. It's better to judge based on actual results than on claims that nobody can verify anyway.

  25. Sabine Cretella
    04 October 2007 04:51

    Sorry, but I don't understand this: the language exists, he can do the localization, he can work on the Incubator and in that way create the community he needs for the wiki. So where is the problem? If the conditions are met at a certain stage, the wiki will be created; that's all.

    Conditional approval does not mean we will approve with only one editor - it says that the request could become a standalone project once our conditions are met.

    The whole conditional approval story was invented at a certain stage, and I find it quite stupid, since everything we need could be included in the original request:

    1) number of editors
    2) localization of UI
    3) contents on incubator
    4) ISO 639-3 code present
    etc.

    In the end this additional stage is just an extra bureaucratic step that I find quite ... ehm ... better not to tell you. The only thing that can happen immediately is the rejection of a request when the ISO 639-3 requirement is not met.

    So in future will we need to write 20 mails about one language where things are obvious? I don't see the necessity, and I don't see how we can decide what is a language and what is not; if we do that we get into endless fights. We can adopt a standard, which we did; languages that believe they are languages and can have an ISO 639-3 code can apply for it (that is not our job; ours is to verify what is there).

    I believe it is time to review our policy accordingly and completely exclude this step of so-called conditional approval (sorry, I hate doing double work).

    Btw. no, I am against revoking the conditional approval of Tachelhit Wikipedia.

  26. Bèrto 'd Sèra
    04 October 2007 05:09

    Hi!

    The only use of conditional approval is to tell requesters that it makes sense to start UI localization, or to give an immediate speedy rejection to unfit requests.

    I would maintain the practice of immediately closing requests that have no ISO code and discussing those that are not related to living human languages (Klingon has a valid code, and historical dead languages do too). Within this framework, IMHO, it makes sense to invest time in requests.

  27. Gerard Meijssen (GerardM)
    04 October 2007 06:11

    <this user has not agreed to public archival.>

  28. Bèrto 'd Sèra
    04 October 2007 18:26

    Moreover, talking about why it's stupid to make "uniqueness assessments": first we write a policy that creates a stupid problem (we may want to have languages with no ISO code), THEN we write an entire bureaucratic procedure that is absolutely useless, because there are NO such cases in ISO 639-3 (unless someone thinks that Dutch and Afrikaans should be united).

    That's how bureaucrats eat up the planet. They invent problems out of thin air and use them to justify their existence and their arbitrary power, instead of using clear rules that exclude problems in principle and make the process quick and transparent, as in a self-service restaurant. In Italian this is called "Department for complicating simple affairs".

    ISO 639-6 will certainly reintroduce the problem, but since building wide acceptance around a hierarchical classification of that size will be *really* hard, I don't think we will have to confront this problem in our lifetimes. It will take decades before such a giant tree is checked and accepted enough to become generally authoritative, and by that time current network technology will have evolved into something completely different. Anyway, even then it will take a vote and serious preparation before anyone switches to ISO 639-6 or any other standard.

Policy change[edit]

No consensus was reached on a policy change.

Bèrto-Pathoschild proposal[edit]

  1. Jesse Plamondon-Willard (Pathoschild)
    04 October 2007 19:33

    Hello,

    Bèrto has suggested (carefully hidden in a pile of rhetorical mud) a change of policy. The essence of his suggestion is to move most requirements in the policy <http://meta.wikimedia.org/wiki/WM:LPP> to the final approval section, remove the limit on mutually-intelligible dialects with language codes, and add an arbitrary subcommittee decision for non-"living" languages. He also suggests making conditional approval automatic for languages with an ISO 639-3 code.

    First, there are a few points to clear up.

    • What is the objective definition of a "living" language?
    • Is 14 solar days a reasonable period for a decision? Could that be shortened? Previous decisions have been taken with 24 or 48-hour notice, with an extension if anyone needs more time to formulate objections.
    • Why would we use solar days? Will any of us be measuring the angles of the sun, or implementing every decision at sunset? The result would be rather close to the analogous clock period anyway.

    If conditional approval is automatic, we should instead scrap the conditional approval step and make it truly automatic: change the header template to include a checklist of requirements to be met, with an ISO code right at the top. This has the added advantage of being easier to understand for requesters than the blank "CONDITIONALLY APPROVED" stamp GerardM has been passing around.

  2. Gerard Meijssen (GerardM)
    04 October 2007 19:43

    <this user has not agreed to public archival.>

  3. Bèrto 'd Sèra
    05 October 2007 04:38

    Well, one unquestionable credit to Jesse is that we will finally do what we should have done long ago: use the policy to "describe what we actually do". Thanks for this :)

    BTW Jesse, I do find some of your phrasing offensive to people and potentially very dangerous if picked up by hostile media. We already receive LOTS of private pressure from more or less delirious users who are convinced of all sorts of conspiracy theories, and we already have a world-wide "frenzy" about the "bureaucrats taking over Wikipedia" (see the last wave of fake articles about "quality checks"). I'm also positive that you simply have no clue about the way in which millions of people can interpret certain phrases, but this doesn't make them any less dangerous.

    Please take a look at reality (I mean exactly THIS; it's not a personal offense to anyone). Reality is easily extracted from the history of our previous decisions. This is the 'consensus' we have reached so far, not by discussing theoretical procedures, but by field work. It also creates a moral obligation towards all requesters, since what has been granted to one cannot be refused to another.

    When we say "living language", we have so far defined a three-element set:
    1) living, i.e. used in everyday communication
    2) classical, not used in everyday communication, yet taught in schools as a means to access a vast corpus of classical knowledge in its original form
    3) historical, an old language no longer used in everyday life, with no internationally accepted use as a classical language

    We rejected all applications for projects belonging to class 3); see Ottoman Turkish and a few others. Such languages are easily identified, as they are always deeply outdated forms of existing class 1) languages. Our mission is to give people an encyclopedia, so class 3) simply has no people to deliver content to, if you don't count a handful of scholars. I'd say class 2) is in pretty much the same condition, if we are to be logical, yet so far class 2) has been granted wikis, and we must maintain consistency. So 1) and 2) have ALWAYS been conditionally okay "in principle". Incidentally, this means that today we would NOT approve an Anglo-Saxon wiki; instead, we would suggest they build a Wikisource, which is a far more appropriate use for such languages. I find this disparity between 1) and 2) "racist", but since it doesn't hurt any living being I won't spend a second worrying about it.

    Now... by "solar day" I mean a period of 24 hours, as opposed to a 'working day', which also must not be a holiday. We all live in different countries, and defining a 'working day' is impossible. Allowing two weeks would bridge almost all possible local holidays without the need to ask for extensions. If all valid votes are cast before the 14-day term expires, the decision is final at that point, so making things quicker only depends on us. So much for definitions. Now let's get to the meat.

    If we set aside all the fuss about uniqueness as a "principle" and treat it as a practical problem, I'd be glad if anyone pointed out possible collisions between ISO 639-3 codes. Even in a "production-oriented" framework that wants to protect the major fundraisers (like en and de), there is no ISO code that could damage a major fundraiser for us to deal with. ISO guidelines do NOT lead to such problems. That's unless you reckon that a Vincentian Creole English wiki could undermine en.wiki; yet stopping an SVC request as non-unique after the approval of Cajun would be a major disparity in treatment, so it cannot even be taken into consideration.

    Instead, the coexistence of Traditional Chinese and Simplified Chinese has proven an immense source of political trouble, and the same is happening with Korean vs Hanja, as we all know. Vietnamese may start a similar war. We do have a serious problem with sc.wikipedia.org, where all "legal" linguistic forms (that is, those corresponding to the description given by the ISO code sc) are forbidden, while a constructed language recently invented by the local government has been made compulsory by the admins. The planet is far from peaceful, but this has never shown any relation whatsoever to ISO guidelines for 'conditional approval'.

    The actual existing problem is rather to provide an environment in which different cultures/scripts/flavours that are undergoing structural tensions may merge without excessive conflict. I doubt this can be done at all. We have a slightly better chance of acting on sc.wikipedia, since the precedent with be.wikipedia can be used to request Board intervention. All the other problems mentioned need tools to be addressed. All we have been doing thus far is "playing dumb" with such requests and hoping that Multilingual MediaWiki will come soon. The only impression we give is that "we don't give a damn". It's not true, but that's how it is rightfully perceived, given the actual results. Possibly we might open a "needs MLMW" section, put there a template saying "go bug your devs" and just be done with it. At least it would be honest, much better than making promises we cannot keep.

    Instead, we have a problem with macrolanguage ISO codes (such as sc = srd).
    See srd, Sardinian, composed of:
    1) sdc Sassarese Sardinian
    2) sdn Gallurese Sardinian
    3) src Logudorese Sardinian
    4) sro Campidanese Sardinian

    My take is that a macrolanguage should NOT have a wiki of its own. It's too risky a move. We could automatically redirect all such requests to a 'needs MLMW' section and grant wikis ONLY to ground-level codes. If and when we have MLMW, people looking for trouble will have enough room to build independent social networks, instead of colliding like crazy. In the meantime, it doesn't make any sense to spend more server space on "language fight clubs". This may also be a future solution for making better projects in the Incubator, if and when grouped languages are at least partially mutually intelligible. It would be good for our own "industrial metrics", to show that the wiki has more articles/active users, yet it remains to be seen whether this could also deliver an encyclopedia that people can use as a source of information. I'm quite doubtful about it.

    As for routine "conditional approval/speedy denial" work, I'm fine with what has happened thus far. We seem to be many, but actually we are but a few, and all of us have very little time. Moreover, almost none of the requests we get is any better than a one-man-band attempt with no real support behind it; once they are given the opportunity to work, they simply fade to grey. So I'd rather invest time in examining together (as we have done so far) the actual results of localization and Incubator work, if and when a team reaches that point. "Conditional approval", as I suggest it, can simply be given out as a notary step, at least for wikis, to stop wasted space/work on non-ISO requests.

    So far, no non-wiki request has been a community start-up. When someone asks you for a Wikibooks project, a WMF community for that code already exists, and it's quite easy to factually determine the human horsepower behind such a linguistic area. It only takes a brief report to assess the potential, as we just did with Shanel's four conditional approvals. While any of us can take the liberty of giving conditional approval/denial at any time based on ISO guidelines, it's much better to keep LangCom informed about it. Yet I'd rather thank whoever did some work in August than open a trial against him based on "formal violations". I find such an attitude really destructive. People's time MUST be respected.

    I can't understand exactly what Gerard means by "a very good reason". I will personally never accept ANY exception, no matter the reason. If ISO is wrong, then we all go to ISO and push them to correct their standard. That's what we tell all our requesters when they have issues with it, and we are no better than any other user. Until ISO says it's okay, my vote is a NO anyway, and I will very vocally oppose any exception at any level. Rules apply to each and every one of us, and they apply to US first. The only rule that needs an exception is a badly formulated rule; at that point you don't make an exception, you take the time to formulate the rule better.

    There is no actual rule for final approval, though. All situations are too different in size, available means and context for us to define any rule leading to a final green light. We had a brilliant Kabyle project, and are now seeing a very weak Sakha incubator. It doesn't look like a project I'd give a green light to. Sakha has internet media, including newspapers and online radio; it seems quite evident that this is the work of a single person without any real community behind him. Sakha is not an oral-only language that might justify such poverty of written production. Then again, it's also mostly located in very badly connected areas of Siberia, whereas the Kabyle have a strong French-based community. They might benefit from advice on marketing their project, but I don't think that what we saw during the last screening can be released as a stable wiki.

    How you can make a rule to manage this without simply writing lots of empty blah-blah remains a total mystery. In the end the one and only rule is: "we either like what you did or we don't; if we don't, you keep working until we do". That's an HONEST formulation. Everything else is good-looking political bullshit put there to hide the arbitrary power we DO have. One point in the current procedure that I find really weak is that no representative of the community takes part in the assessment. Quite apart from their being unable to defend their work (which nobody has been able to do thus far, and this IS unfair), we end up delivering a "value judgement" instead of practical advice. People get a Good/Crap signal instead of an evaluation of the strong and weak spots in their work. This is something we might want to work on, because it's really too hierarchical as a form of communication, and it's made even worse by coinciding with a phase of totally arbitrary decisions.

    Finally, most of us have little or no time to work on Meta. It *might* be worth thinking about a public mailing list for these activities. It would help beginners get to know and support each other. It should be run in at least a few major languages, as we deal with local cultures that often don't have English as a standard means of international communication. It's not something WE should use; it's rather something conditionally approved requesters should be granted access to. It may save us LOTS of time and help people much better than we can personally. It's a wiki; it should be THEM doing the job on their own, whenever possible.

  4. Gerard Meijssen (GerardM)
    05 October 2007 07:22

    <this user has not agreed to public archival.>

  5. Bèrto 'd Sèra
    05 October 2007 08:10

    Hoi,

    Well, you need a finite limit before counting votes. Otherwise you get to the point where most people haven't voted (it has always happened so far) and anyone can claim that the few who did vote are "forcing a POV". So in all kinds of boards and elections you have a limit beyond which votes are no longer valid. It's a commonly used formal barrier.

    Anyway, please note that we are referring to conditional approval only, so:
    1) historical languages
    2) constructed languages
    3) non-wiki projects (which will be 99% of the process volume)

    Living-language wikis get conditionally approved/denied instantly on ISO proof, as soon as someone has the time to review open requests and flag them as per policy. No vote is required for that. This means that in time people will get the message, and we should see fewer and fewer speedy denials for absence of an ISO code.

    The 14-solar-day limit refers to what happened when quick decisions were made in the past and some members complained they didn't have time to review the proposals and cast a vote. Since we all have very loaded schedules and some of us also travel, it makes sense to allow time for everyone to make it. If everyone votes before then (it will never happen, based on past experience), the vote is closed immediately.

    ====

    Assessing a localized Incubator project IS different, and it cannot be done in a fixed amount of time. As I said, there is but one rule for 'final' assessment, and the rule says 'we do as we please'. Just finding a native speaker who can verify that content in the alleged XXX code really IS in the XXX code may require more than a month, depending on the location involved. Not everyone will cut and paste content from languages one of us speaks (as happened with Sakha), so verification is necessary, and the chance that we will need external expertise will grow in time and complexity as we reach more remote locations. Stating a fixed amount of time for this phase is simply unrealistic. It may be 5 minutes or 3 months, depending on such factors. Once the assessment is over and we are ready to vote on a decision, the 14-day term may apply, once again, for formal reasons.

    I think the preparatory discussion does NOT need an envoy from the community. We can assess things much more calmly in private and reach a consensus before we meet them. Nevertheless, it would be very unfair not to have at least one person from the project meet us 'personally' to receive the results of the assessment and discuss them, for example on Skype, IRC or whatever, so that we can tell them what we think and they have the chance to explain their POV and request explanations about ours. Once in a while we'll run into a madman, but most of the time I believe this can actually help projects get better and help us understand their practical problems.

    After all, the frequency of projects claiming to be 'ready' is low, and the remaining unused wiki codes concern communities that have increasingly lower probabilities of successfully building a project because of low connectivity, number of native speakers, etc., so it's not going to be a big burden. Instead, it can help us build good relations and a number of "case studies" that can serve as a basis for developing some sort of publishing policy for small communities.

  6. Gerard Meijssen (GerardM)
    05 October 2007 09:00

    <this user has not agreed to public archival.>

Bèrto proposal #2[edit]

  1. Bèrto 'd Sèra
    07 October 2007 23:39

    Hoi,

    • When it is proved that what is presented is not that language, like in the case of Karelian, we will block anyway.

    I’d rather put it in the «not yet» section and require they become consistent with their ISO code. I guess the difference is only in the way you word it.

  2. Jesse Plamondon-Willard (Pathoschild)
    08 October 2007 01:28

    Hoi,

    That's already implicit in the policy; I have no objection to that principle.

  3. Bèrto 'd Sèra
    07 October 2007 23:36

    Ooookay... I'll stay up another 30 minutes and try to put some order into my thoughts.

    My first impression was that Jesse had come over here to start a quarrel based on principle. I erupted at that. Then it dawned on me that Jesse had actual reasons to be annoyed, since he’s one of those (if not the one and only) who cleans up the crap resulting from chaos. On the other hand, I still think that Jesse misses a few points, since we have worked thus far based on precedents of achieved consensus, NOT on formal policies.

    We have two major problems: 1) we have a policy written BEFORE actual work was done; the consensus that developed over time moved away from the literal text of the policy, but the policy was never updated;
    2) we use, IMHO, the wrong tool to deliver the service of conditional approval, thus generating LOTS of useless work that Jesse (or anyone else) has to perform in the uncomfortable position of having an outdated policy as his sole reference, OR being a telepath, OR reading all the LangCom acta to keep himself current.

    This is stupid and it leads to friction. When annoyed people meet each other, all they do is fight, instead of helping each other and the projects out there.

    So, IMHO, we need both to change the policy AND to use a self-documenting tool to perform conditional approval. This will free resources for better use, since we have fairly few people and little time BOTH in LangCom and outside it. We also need to remember the old principle that "good fences make good neighbours".

    My proposal is that we remove from conditional approval all undetermined conditions that require subjective discussion. Conditional approval will ONLY be meant as a warranty to a project that LangCom will not object to their request IN PRINCIPLE, i.e. on grounds of ISO code absence or project feasibility.

    So, we can finally say that a project is feasible if: 1) it has a valid ISO 639-3 code (and thus it is considered «a language» and «sufficiently unique» at international level)
    2) it is requested for a «living human language» or a «classical language»
    3) it is a request for a wikipedia (which is the most common case we get)

    «Historical languages» (i.e. outdated forms that are not considered «classical» according to a list to be defined, but surely containing Latin and Greek, for example) are rerouted to Wikisource, as they don't have a living audience that might use an encyclopedia.

    Everything that does not fit in the previous rules is to be assessed by detailed analysis.

    Tool: I propose we receive and conditionally approve requests through the OTRS tool. There might be problems in doing so, but we really need to lift some of the bullshit from the Meta guys' shoulders. Once a project is conditionally approved, it DOES make sense to open a Meta section, NOT for useless theoretical discussions, but to let volunteers help the projects and provide much-needed practical guidance. I still think that MW is a bad forum, but people requesting a wiki are supposed to learn to use one, so it's nice to keep them on MW for service communications.

    So, in steps: 1) We open a page pointing the guys to OTRS
    2) On the SAME page we publish a short guide telling them HOW TO FIND THEIR ISO CODE
    3) We analyze the requests and evaluate the ticket
    4) If the ticket is approved we
    a. send a request to Betawiki to open a new localization
    b. open a status page on meta on which volunteers can help and track progress as they please (no control on their format/duties from LangCom on this)
    c. whenever the project wishes, they can request final approval by opening a new ticket

    This second class of ticket has only 2 possible answers:
    1) yes
    2) not yet

    LangCom has LOST the chance to say no when it gave cond-approval; from there on it can only say that a project is still too weak, not that it’s wrong. A «yes» doesn’t need a motivation, a «not yet» does. When LangCom decides that it’s too early for a release, we invite a representative elected by the community and we discuss our POV with him/her, explaining what we feel should be added.

    What we actually state is that:
    1) the articles look like wiki articles (markup has been learned to a decent level, they can put in pictures, have interwikis, can use titles and sections, etc)
    2) the language used for that wiki has been confirmed to be the language described by the ISO code associated with the domain, either directly by one of us or by an independent third party.
    3) UI has been localized to the required minimal extent.

    There’s nothing else we can analyze with any chance of it being accurate. I mean, we can read the histories and see how many people make edits, but that’s... I have a floating IP and can use 12 of them from different networks, if I want to. It’s stupid to put in conditions that will only favour those who can «hack» their way through the policy. You’ll easily end up passing a clever asshole faking 12 users while stopping an edition made by 3 real guys.

    It’s not supposed to be the Bible. Take your time and think about how to make it better. But please, no theory. Let’s keep our feet on the ground. It’s only a job, let’s make it easy and idiot-proof.

  4. Bèrto 'd Sèra
    07 October 2007 23:42

    Sorry, this was badly worded

    So, we can finally say that a project is feasible if AT THE SAME TIME: 1) it has a valid ISO 639-3 code (and thus it is considered «a language» and «sufficiently unique» at international level)
    2) it is requested for a «living human language» or a «classical language»
    3) it is a request for a wikipedia (which is the most common case we get)

  5. Bèrto 'd Sèra
    08 October 2007 04:07

    I thought I’d sleep, but I suddenly woke up remembering that there’s another class of requests the old policy wasn’t addressing at all: alternate localizations.

    Even if and when we avoid releasing wikis for macrolanguages, there are a number of languages that have flavours so distant from each other that a single localization may become problematic to use. We already have that trouble for EML and LMO in Italy; many more are to come, I’m quite sure.

    So... if we are okay with IANA codes being used to identify message files, we can open a third class of tickets. For these requests we:
    1) point users to a resource where they can identify an existing IANA code
    2) point users to a list on which they can request one

    Once they have a code, they can submit their ticket. There’s not much we need to evaluate here, apart from checking that the code really has been issued. If that is the case, we ask Betawiki to open a new message set for them. Meta is not involved in the process at all, I would say. Maybe we don’t even need to check how nice the localization is, since it’s not the default localization anywhere. Anyway, even if we do, we can simply issue a list of messages to localize before Betawiki commits the new file, so we keep the workload at an absolute minimum, since there may be many such requests.

  6. Jon Harald Søby
    08 October 2007 05:30

    I think we don't really have anything to do with who can and cannot create localizations; if someone wants to create a MediaWiki localization for leet, Pig Latin or Elmer Fudd, that is none of our concern. The only thing we can do is say yes or not yet to a project based on whether or not they have a localization ready.

  7. Gerard Meijssen (GerardM)
    08 October 2007 05:38

    <this user has not agreed to public archival.>

  8. Bèrto 'd Sèra
    08 October 2007 15:00

    So okay. We delegate our code-related approvals to:
    1) ISO 639-3 (domain naming)
    2) IANA (language subtags)

    This is for ALL that concerns LangCom decisions. Whatever people can/want to do outside of our scope of decisions is none of our business.

    Does this make any sense?

  9. Gerard Meijssen (GerardM)
    08 October 2007 15:11

    <this user has not agreed to public archival.>

  10. Jon Harald Søby
    08 October 2007 15:49

    2007/10/8, GerardM <gerard.meijssen at gmail.com>:
    {{ls-censored|quote}}

    I have no idea what you are trying to say here. Can you please explain more? The sentences don't seem to have anything to do with each other.

  11. Bèrto 'd Sèra
    08 October 2007 22:03

    Wait. The quality of a project is assessed based on facts, that is, AFTER code evaluation, when a project asks to be officially released. We had this stated in the part in which we evaluate the “quality” of the work, i.e., whether they can properly use MW to produce meaningful material. Is this what you mean?

    The aspect with localization is different in that it deals with already existing projects, as an ALTERNATE localization. Should we check the quality of these, too? If so, how? To clarify further: we are not controlling individuals' right to produce alternate language packs. What we can control (if we are to be realistic) is access to Betawiki and the inclusion of such language files in the official MW distro.


  12. 08 October 2007 23:46

    <this user has not agreed to public archival.>

  13. Bèrto 'd Sèra
    08 October 2007 23:57

    Okay, I'll better explain the practical problem from which all this starts.

    We have a number of wikis that want to have different localizations WITHIN the same ISO 639-3 language code. Say, LMO or EML. This is frequently leading to "secessionist" requests, and our refusals to allow independent wikis are leading to friction with people who have a practical, rather than political, problem.

    If this friction grows, the problem becomes political, as they think that LangCom is condemning them to use "another dialect". They also start to have internal fights to decide which dialect is best, and this quickly gets political, too. My aim is to keep practical problems on practical grounds.

    So, what I suggest is that we accept a request for an alternate language pack, so that a registered user may decide to use one dialect or another for his/her interface. And that we delegate to IANA the decision on whether the request is correctly formed or not. If IANA feels they are qualified to get a code, this code may identify the message pack, and thus the alternate version is included in the "language offer" to registered users, with the "official dialect" as a fallback language (see the sketch at the end of this message).

    It sounds complicated, but it's a real problem, as many ISO codes include under one code dialects that are only mutually intelligible to a limited extent, and this cannot fail to create friction.

    Again, if anyone has a better solution, just propose it. The aim is to solve THIS usability problem.
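
    A minimal sketch, in JavaScript, of the fallback behaviour proposed above. The message packs, the getMessage function and the "eml-bolog" subtag are invented purely for illustration; this is not an existing MediaWiki API, and the subtag is not a real registered code.

        // Look a message up in the user's alternate (dialect) pack first, then
        // fall back to the pack of the "official dialect" (the bare ISO 639-3
        // code). All codes and strings are invented for this example.
        var messagePacks = {
            'eml': { 'edit': 'Edit (default pack)', 'history': 'History (default pack)' },
            'eml-bolog': { 'edit': 'Edit (alternate dialect pack)' } // hypothetical subtag
        };

        function getMessage(key, userLang) {
            // e.g. 'eml-bolog' falls back to 'eml'
            var chain = [userLang, userLang.split('-')[0]];
            for (var i = 0; i < chain.length; i++) {
                var pack = messagePacks[chain[i]];
                if (pack && Object.prototype.hasOwnProperty.call(pack, key)) {
                    return pack[key];
                }
            }
            return key; // last resort: show the raw message key
        }

        // getMessage('history', 'eml-bolog') returns the default-pack string,
        // because the alternate pack does not override that key.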


  14. 09 October 2007 00:14

    <this user has not agreed to public archival.>

  15. Gerard Meijssen (GerardM)
    09 October 2007 04:52

    <this user has not agreed to public archival.>

  16. Bèrto 'd Sèra
    09 October 2007 04:56

    Well, what happens AFTER approval is largely out of our control. We can present the case to the Board, but nothing keeps people from «behaving» in the incubator and then going nuts just a few seconds after they get released as a domain.

    Anyway, I’m definitely in favour of any recognition being given by an international entity outside WMF. It’s not our business to decide what is in a standard and what is not.


  17. 09 October 2007 16:43

    <this user has not agreed to public archival.>

  18. Gerard Meijssen (GerardM)
    09 October 2007 20:13

    <this user has not agreed to public archival.>


  19. 09 October 2007 23:05

    <this user has not agreed to public archival.>

  20. Gerard Meijssen (GerardM)
    10 October 2007 10:49

    <this user has not agreed to public archival.>


  21. 08 October 2007 23:55

    <this user has not agreed to public archival.>

  22. Bèrto 'd Sèra
    09 October 2007 00:00

    That would be fine for me, since it also has to describe content WITHIN a span tag (localized strings).

    It doesn't help much with articles written in this alternate dialect, but I'm afraid we shall have to wait for MultiLanguage MediaWiki to achieve that. And programming is out of our control.

    Anyway, it would make sense to choose a standard that can fully qualify the page, too, once MLMW is out and usable.


  23. 09 October 2007 00:18

    <this user has not agreed to public archival.>

  24. Bèrto 'd Sèra
    09 October 2007 00:57

    Hi!

    Just any language. Let's make a practical example. The EML entity is made of a huge number of sub-entities, some of which are so different that they use a different script (thank God, they are all Latin-based, yet the same sound may get fairly different representations). I will now INVENT codes that only make sense to illustrate the problem; I'm not suggesting they should be used in real life.

    EML, as such, does not exist in nature. Depending on the city you go to, you'll get another flavour. So...
    EML-bo is from Bologna
    EML-ar is from Reggio Emilia
    EML-mo is from Modena
    EML-ro is from Romagna and Northern Marche (two regions)

    The EML-ro/rest-of-the-language split roughly marks the border between the old Exarchate of Ravenna (the Byzantine domain in the 6th-9th centuries) and the Lombard Kingdom. A similar distinction arises in LMO, roughly following the border between the Marquisate of Verona and the Kingdom of Lombardy (see the map at http://en.wikipedia.org/wiki/Duchy_of_Spoleto where the exarchate is called Pentapolis). It is important to understand how old these borders are, to appreciate the amount of "drift" they gave rise to.

    Now, when we unite these languages under EML (which really makes sense, because they are extremely close to each other from many a POV), we don't solve the problem that many an everyday expression used in one language is not understandable in another. So, translating "edit" into a single common (koiné) word may be impossible; each dialect may need a different word to assemble a usable UI (which IS a basic asset to market an edition).

    My request is that we do not start to invent "private codes", but have a well-formed standard instead, while allowing practical flexibility for editions that need it. It doesn't matter which standard we choose; IMHO all we need is an external, internationally recognized entity that will issue the codes (if needed) and accept/refuse such requests, so that once again we only have a notary function in the process.

Pathoschild #2: historical languages[edit]

The language proposal policy was slightly modified to require a sufficient number of "living native speakers" rather than "interested editors".

  1. Jesse Plamondon-Willard (Pathoschild)
    17 October 2007 06:44

    Hello,

    I propose changing the language proposal policy to limit new wikis to languages with living native speakers, per previous discussion and practice. This is logical because the Foundation's goal is to provide knowledge and resources to every living person, which is not furthered by spreading its contributors among languages that nobody natively reads anymore (nor is it our mission to promote extinct languages).

    I suggest also applying this restriction to Wikisource wikis. While it *is* within the Wikisources' missions to host published content in extinct languages, Wikisource wikis are by their nature very flexible with regards to having multiple variants of a language.

    For example, the English Wikisource accepts all variants of English, including Old English which is completely unintelligible to modern English readers. An Old English Wikisource (pre-langcom) was recently closed because contributors of Old English content simply did not split from the main English Wikisource.

    <http://meta.wikimedia.org/wiki/WM:LPP>

  2. Gerard Meijssen (GerardM)
    17 October 2007 06:48

    <this user has not agreed to public archival.>

  3. Jesse Plamondon-Willard (Pathoschild)
    17 October 2007 06:55

    Hello,

    I've provided some reasons not to make an exception for Wikisource based on my experience as a Wikisource editor, mainly that it is easy to include multiple variants in a single Wikisource wiki. Could you explain why you disagree?

  4. Gerard Meijssen (GerardM)
    17 October 2007 07:15

    <this user has not agreed to public archival.>

  5. Jesse Plamondon-Willard (Pathoschild)
    17 October 2007 07:42

    GerardM,

    See my proposal above for dealing with historical languages with no modern equivalent.

    Combining extinct variants of a language on one wiki is not artificial or wrong; the interface and project documentation need only be in the modern language, so the subdomain is for the modern language. The only content in the extinct variants is the original works, and these can be tagged accordingly. Hosting all variants is already routine for Wikisource, on both the English and French subdomains and probably most others.

    Latin does not seem to have any native speakers today, although it is used in traditional ceremonies and can be learned alongside other ancient languages. ISO 639-3 classifies it as "ancient", which it defines as "went extinct in ancient times (e.g. more than a millennium ago)".

    <http://en.wikisource.org/wiki/Category:Old_English_works>
    <http://fr.wikisource.org/wiki/Aide:Typographie_de_l'ancien_français>
    <http://www.pathos.ca/tools/ISO639DB/index.php?search=lat&exact=true>

  6. Bèrto 'd Sèra
    17 October 2007 06:54

    Hi!

    It would definitely make sense to group all historical versions of a single language within a single container. It would also make for better visibility, since people would have access to a wider range of documents.

    However, we have a number of practical cataloguing problems with this:
    1) there is a chance that one historical language is "shared" among a set of modern cultures, like Latin.
    2) as is often the case, we lack MultiLanguage MediaWiki. Until we have it, it is impossible to properly catalogue content by putting it all in a single wiki.

    Point 2) is (hopefully) temporary in nature, yet point 1) will probably happen a number of times. The Ottoman Empire gave way to a number of independent states; the Azeri, just to name one, may be just as interested in Ottoman Turkish as the Turks proper are.

    We might need a way to deal with this by automatically exposing content in "related languages", but I can't see an immediate solution. I'm afraid that having independent wikisources is at the moment the least dangerous solution, although I share the approach towards extinct languages.

  7. Jesse Plamondon-Willard (Pathoschild)
    17 October 2007 07:02

    Berto,

    I think language tags can be applied to any container on a page, so it would be possible to designate a different variant by doing something like <div lang="ang">Old English text</div>. Karen can probably confirm this; if so, language tagging is not an issue.

    You do raise an interesting point regarding Latin; there would be no modern equivalent for it to share a wiki with. Maybe we could make an exception for such languages, while variants like Old English share with their modern equivalent like English?

  8. Bèrto 'd Sèra
    17 October 2007 07:44

    Hoi,

    Gerard, I honestly doubt that there is a single family on earth raising children as native Latin speakers. Anyway, this is once again a single case; we need a formal policy. Problems with uniting a historic language with a later version may appear everywhere.

    Ottoman Turkish in itself is a good candidate for Azeri and Turkmen interest, let alone the historical importance it has for a number of Arab countries.

    I wonder if Classical Tamil can have such a situation. Old French is definitely going to be of interest for Cajun French. Anglo-Saxon is probably going to be of importance for people studying Frisian, too. I'd say Old English should already be "local", and including it in en.wikisource should be safe, but you might want to hear different opinions on this issue. Old Celtic wikisources, if any, may benefit from a merge to allow comparative studies.

    IMHO, wikisource should be ONE, like commons. What we would really need is a way to correctly tag content AND properly expose related cultures, languages and periods.

    It's a public library; it should give people the opportunity to span across linguistic media, not only across subjects. I'm afraid this would require a serious modification of the underlying software, but if you are into historical sources there's no way for you to study a period without making such comparisons.

    People doing classical studies, for example, will often want to see different versions of Plato, in Arabic, Greek and Latin. Many ancient texts passed through a number of versions before being imported back into their original environment.

    I'm happy with the fact that we are going to have the chance to properly tag pages. So why don't we consider asking all wikisources to merge? But we still need to suggest a way to properly highlight related material.

  9. Jesse Plamondon-Willard (Pathoschild)
    17 October 2007 07:53

    Berto,

    Wikisource explicitly chose to split; there was originally a unified Wikisource. There was a lot of discussion about it, but the overall consensus was that language subdomains were better for Wikisource's purposes. The unified Wikisource is now used as a repository for texts with no adequate subdomain.

    <http://en.wikipedia.org/wiki/Wikisource#Language_subdomains>
    <http://wikisource.org>

  10. Bèrto 'd Sèra
    17 October 2007 08:05

    If one can split, all can split. Then it should be up to individual projects to decide whether they want to exist as specialized branches of an existing wikisource or go their own way.

    It's sets of humans we deal with. They may not like each other, or have any number of issues. If we're talking content, then I've said my POV, but if we're talking humans then it's none of our business to tell them who they should work with.

  11. Jesse Plamondon-Willard (Pathoschild)
    17 October 2007 16:46

    Berto,

    Your suggestion that "if one can split, all can split" isn't applicable to my proposal, since I did not suggest that any variant be allowed to split. Historical languages with no modern equivalent could have their own Wikisource, but there's nothing to split them from. That's why they have their own.

    Whether or not individuals "like each other" isn't very relevant; they can collaborate even more easily than speakers of different English dialects do on the English Wikipedia. Published works are less collaborative; you can't change the text just because you don't like the point of view or dialect in it. A text can easily be uploaded in every dialect it was ever published in without conflict.

    There are 25 variants of English with an ISO 639-3 code, and far more than that should have codes. That does not mean we should fracture the English Wikisource community into 25 wikis. The subcommittee is supposed to ensure that new projects prosper, and this is best assured by not needlessly splitting a large community that can work together into many small communities. We're not a rubber-stamp bot interfaced with the code database; we need to think about our decisions.

    If you'd like, we can hold a discussion in the Wikisource community to decide what the Wikisource community as a whole favours; they're better placed to know what they need than subcommittee members who don't even edit Wikisource. There are no communities in historical languages (if there were, they would not be historical).

    <http://www.pathos.ca/tools/ISO639DB/index.php?search=English>

  12. Bèrto 'd Sèra
    18 October 2007 06:57

    Hi!

    If individuals aren't relevant, then why don't we have just ONE wikisource? We can tag content as we please, we said, so it's not a tech matter. All that remains is people. Based on the same dynamics, tomorrow you could split Commons. It will be people doing it, not content. We are not here to manage people; let them go where they want to go. We have more than enough trouble with people who are artificially kept together (see Koreans, Chinese, etc.) when they really want to split.

  13. Bèrto 'd Sèra
    18 October 2007 07:10

    Hi,

    I split this thread, for convenience.

    As for ISO 639-3 codes, it's only 22 for English (http://www.sil.org/iso639-3/codes.asp?order=639_3&letter=%25 ):
    AIG Antigua and Barbuda Creole English
    ANG Old English (ca. 450-1100) Historical
    BAH Bahamas Creole English
    BZJ Belize Kriol English
    BZK Nicaragua Creole English
    CPI Chinese Pidgin English
    ENG English
    ENM Middle English (1100-1500) Historical
    FPE Fernando Po Creole English
    GCL Grenadian Creole English
    GUL Sea Island Creole English
    GYN Guyanese Creole English
    HWC Hawai'i Creole English
    ICR Islander Creole English
    JAM Jamaican Creole English
    LIR Liberian English
    SVC Vincentian Creole English
    TCH Turks And Caicos Creole English
    TGH Tobagonian Creole English
    TRF Trinidadian Creole English
    VIC Virgin Islands Creole English

    ALL of them describe independent languages that so far would have been treated as such on request (2 are historical). If we now explicitly want to forbid Creole languages, we must also shut down Cajun French and whatever else falls in that domain. Only, how do we identify "creole"? There's no such flag from any reliable external source. Do we issue "Creole yellow stars" ourselves? Based on what?

    Say we worry about English only. We still cannot say that Creole languages of French and German have dignity, whilst English Creole languages have none. Again, based on what?

    Can you imagine the impact of saying that, no matter the ISO codes, "Cajun French" IS a language and "Hawai'i Creole English" is NOT? Which of them is white and which isn't? BOTH are American citizens, but they would not happen to have the same rights granted by an American charity... I'm not very well versed in US law, but I imagine this CAN be enough to revoke 501(c)(3) charity status, and it's surely going to be a major blow to any fundraiser.

    There is NO code for UK English vs US English, so there is NO problem for en.wiki. Please, don't look for trouble by trying to decide whose ethnic group is better than another's.

  14. Gerard Meijssen (GerardM)
    18 October 2007 09:16

    <this user has not agreed to public archival.>

  15. Jon Harald Søby
    18 October 2007 18:01

    Of course we shall allow creoles. Anything else would be discrimination.

  16. Jesse Plamondon-Willard (Pathoschild)
    18 October 2007 18:38

    Hello,

    I agree, wikis for living Creole languages that meet the policy are fine.


  17. 18 October 2007 21:04

    <this user has not agreed to public archival.>

  18. Jesse Plamondon-Willard (Pathoschild)
    18 October 2007 22:38

    Hello Karen,

    I've recompiled the ISO 639-3 database, so it's up to date as of today. I can do this again periodically when changes are made.


  19. 18 October 2007 22:50

    <this user has not agreed to public archival.>

Interface deregulation[edit]

  1. Bèrto 'd Sèra
    06 October 2007 01:53

    One of the recurrent problems we have is defined as follows:

    I don’t like the mainstream localization in our wiki. My regional flavour is different, we want our OWN message file, but… my flavour hasn’t got an ISO 639-3 code. Why can’t I have a “private code”?

    The answer is usually a NO. Yet, the number of such problems is mounting.

    I don’t want to have these codes emitted. First of all, there is no way to check which is which; even in the places I know best, languages that have this problem don’t have any published “normative” work that defines how to write what. It’s quite normal that places like EML and VEC let everyone feel free to “create your own grammar set”. So if we open the floodgates, the volume of new codes can become simply impossible to keep track of, let alone have any validity outside a single flat. You may pretty often have the case of people wishing to use alternate scripts they invented themselves, etc. (that’s why I can’t think they would be welcome at IANA, either). We cannot be the ones who decide whether a never previously codified version is “good” or “bad”, either. We’d only end up having tons of private codes named ISO-x-author’s-name, which is simply stupid.

    So what I’m rather thinking is that we could request a system that uses javascript to substitute strings on the fly. People would edit their own localization.js on their user pages just like they edit their monobooks (see http://en.wikipedia.org/wiki/Wikipedia:Monobook), as a result:
    1. everyone is free to have whatever the heck they want on THEIR OWN screens
    2. we don’t need to invent order from chaos and kill ourselves to make them happy

    In practice, this means good borders and freedom for all:
    1. If a WMF community of 15 people wants to have 16 different language files they are welcome to do so,
    2. WMF (and thus LangCom) does not rule the process in any way,
    3. Betawiki doesn’t need to be disturbed,
    4. The international MW distribution doesn’t get bloated with localizations that have no international recognition.

    Would this be fine for everyone?

    Another thing is “how to actually do it”, but I’ll make tech enquiries in other places, if you give me the green light. Possibly, this will reduce itself to making a modified monobook that editions wishing to open the free-interface gates may use. Once a string has a known ID, javascript can substitute it directly in the browser (see the sketch below).
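
    As a minimal sketch of what such a localization.js could do (assuming, purely for illustration, that interface strings are rendered with an identifying data-msg attribute; that attribute and these names are invented here and are not an existing MediaWiki feature):

        // User-page script, loaded like a monobook.js customization. It swaps
        // interface strings client-side, so nothing changes for anyone else
        // and no new language code is needed.
        var myStrings = {
            'edit': 'Edit (my own wording)',
            'history': 'History (my own wording)'
        };

        function localizeInterface() {
            var nodes = document.querySelectorAll('[data-msg]');
            for (var i = 0; i < nodes.length; i++) {
                var key = nodes[i].getAttribute('data-msg');
                if (Object.prototype.hasOwnProperty.call(myStrings, key)) {
                    nodes[i].textContent = myStrings[key];
                }
            }
        }

        if (document.readyState === 'loading') {
            document.addEventListener('DOMContentLoaded', localizeInterface);
        } else {
            localizeInterface();
        }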

  2. Gerard Meijssen (GerardM)
    06 October 2007 05:27

    <this user has not agreed to public archival.>

  3. Bèrto 'd Sèra
    06 October 2007 06:31

    Hoi,

    These are 2 different issues:
    1) you have a stable flavour of the language that can have an IANA code
    2) there is no stable evidence and they are still in a position in which they invite people to “discuss how we should write our language”

    In the second case, providing a code is risky. If we have a code, then fine, they make an official language file and everyone’s happy. But if they don’t get a code we are going to be bugged forever, which is something I’d rather avoid, and they have no place to do experimental work to stabilize the written form.

    In this second case I believe we should give them the chance to have private client-based solutions. It’s not something that will require any intervention from our side; they can write the javascript themselves and work on it. If and when they reach consensus, they can apply for an IANA code and transfer their work to an official release. But I’d rather avoid edit wars on interfaces, and they are likely to happen in such fluid environments.

  4. Gerard Meijssen (GerardM)
    06 October 2007 09:36

    <this user has not agreed to public archival.>

Wikipedia Bikol[edit]

No decision was taken regarding the request for a Bikol Wikipedia.

  1. Jesse Plamondon-Willard (Pathoschild)
    07 October 2007 17:14

    Hello,

    I propose the final approval of the Bikol Wikipedia. All initial criteria have been met, the test project is very successful (see <http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/wp-bcl>), and the interface localization is complete.

    I'll approve it in 48 hours if nobody objects. If you need more time to formulate your objections, please say so.

  2. Shanel Kalicharan
    07 October 2007 18:42

    {{support}}.

  3. Bèrto 'd Sèra
    07 October 2007 18:52

    Hoi,

    I agree that pages do look good, and the quantity is absolutely sufficient.

    We have a further step before saying a final yes: is there any Bikol native speaker (external to the project) who can give us an independent confirmation that the language in which these pages are written actually IS Bikol?

    If not, which usually is the case, does anyone have an idea of where we can get such verification from? So it's a *temporary oppose* for now, hopefully for a very short time.

    It does look like an Austronesian language, yet "looking like" and "being" may be two different things. If nobody wants to deal with it, I'll volunteer to look for an independent confirmation; I can start on Monday (tomorrow morning). I'll find a contact in the Philippine government or universities, to have them read the text and give the final green light.

  4. Bèrto 'd Sèra
    07 October 2007 18:55

    Take away my temporary oppose if any of us speaks Bikol and can provide the certification himself/herself, obviously.

  5. Jesse Plamondon-Willard (Pathoschild)
    07 October 2007 20:12

    I'll trust you with that remaining verification. Can you estimate how long that should take?

  6. Bèrto 'd Sèra
    07 October 2007 20:22

    Hi!

    Hopefully no more than a couple of days. I suppose we can identify a university in the Philippines and ask someone from a literature Dept. to have a look at it.

    Hope they read their mail and are willing to answer. This much is out of my control. I'll copy this list on the mails I'm sending and ask them to copy the list on their answers, so we avoid POV interpretations.

  7. Gerard Meijssen (GerardM)
    07 October 2007 22:25

    <this user has not agreed to public archival.>

  8. Sabine Cretella
    07 October 2007 22:35

    I support the approval - but I find it good if Berto has the possibility to ask someone about the quality of the messages, so we should allow for that check before we publicly approve. And yes: in the end we have to assume good faith of people and knowledge of the language when we don't know the languages ourselves.

    As for the length of articles, I would kindly ask you to read a blog post of mine ... Bikol is one of the "less resourced languages" - at least IMHO. http://sabinecretella.blogspot.com/2007/08/length-of-articles-in-small-wikipedias.html

    Thanks, Sabine

    p.s. and all of you: please calm down now ...

  9. Bèrto 'd Sèra
    08 October 2007 22:47

    Hoi!

    The source I identified is this: http://www.dlsu.edu.ph/inside/organizations/lsp/journal.asp

    I chose Angela P. Sarile from this page: http://www.dlsu.edu.ph/inside/organizations/lsp/officers.asp

    based on the fact that Manila is located on the island of Luzon, where most Bikol native speakers are.

    I will now copy the list on the mail I'm sending her.

  10. Bèrto 'd Sèra
    08 October 2007 23:23

    Dear Mrs. Sarile,

    I write on behalf of the Wikimedia Foundation Committee for Languages, to ask your advice on the final identification of a language. We have started a project to release a Bikol edition of Wikipedia, and now need to certify that the articles listed on this page really are written in Bikol. As is always the case when nobody on the Committee can provide direct certification, we require an expert third party to confirm the identity of the language in which the material is expressed, prior to releasing the project.

    The list of pages is published at: http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/wp-bcl

    We would really appreciate it if you could have a look at them and confirm that they really are written in Bikol, or if you could point us to another source for such confirmation.

    Thank you in advance for your time and attention

  11. Bèrto 'd Sèra
    10 October 2007 17:45

    This might be the reason why we have had no answers about Bikol so far

    Berto 'd Sera

    -----Original Message-----
    From: Mail Delivery Subsystem
    Sent: Wednesday, October 10, 2007 3:48 AM
    To:
    Subject: Delivery Status Notification (Delay)

  12. Bèrto 'd Sèra
    11 October 2007 11:14

    It keeps being delayed. I'll try another contact

  13. Bèrto 'd Sèra
    13 October 2007 05:20

    Dear Mrs. Ward,

    I write on behalf of the Wikimedia Foundation Committee for Languages, to ask your advice on the final identification of a language. We have started a project to release a Bikol edition of Wikipedia, and now need to certify that the articles listed on this page really are written in Bikol. As is always the case when nobody on the Committee can provide direct certification, we require an expert third party to confirm the identity of the language in which the material is expressed, prior to releasing the project.

    The list of pages is published at: http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/wp-bcl

    We would really appreciate it if you could have a look at them and confirm that they really are written in Bikol, or if you could kindly point us to another source for such confirmation.

    Thank you in advance for your time and attention

  14. Bèrto 'd Sèra
    13 October 2007 05:25

    Bad news again...

    Berto 'd Sera

    -----Original Message-----
    From: Mail Delivery Subsystem [mailto:mailer-daemon at googlemail.com]
    Sent: Saturday, October 13, 2007 8:20 AM
    To:
    Subject: Delivery Status Notification (Failure)

  15. Bèrto 'd Sèra
    13 October 2007 05:26

    Dear Mr. Sibayan,

    I write on behalf of the Wikimedia Foundation Committee for Languages, to ask your advice on the final identification of a language. We have started a project to release a Bikol edition of Wikipedia, and now need to certify that the articles listed on this page really are written in Bikol. As is always the case when nobody on the Committee can provide direct certification, we require an expert third party to confirm the identity of the language in which the material is expressed, prior to releasing the project.

    The list of pages is published at: http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/wp-bcl

    We would really appreciate it if you could have a look at them and confirm that they really are written in Bikol, or if you could point us to another source for such confirmation.

    Thank you in advance for your time and attention

  16. Bèrto 'd Sèra
    13 October 2007 05:31

    And this is the third. I'll try to contact the university via the online form.

    Berto 'd Sera

    -----Original Message-----
    From: Mail Delivery Subsystem
    Sent: Saturday, October 13, 2007 8:27 AM
    To:
    Subject: Delivery Status Notification (Failure)

  17. Bèrto 'd Sèra
    13 October 2007 05:38

    I just sent the following through the online form of the university:

    I'm trying to contact someone from the Linguistic Society of the Philippines on behalf of the Language Committee of the Wikimedia Foundation for a technical issue regarding the Bikol language. Apparently all email addresses published at http://www.dlsu.edu.ph/inside/organizations/lsp/officers.asp are non-existent. Can you point me to a working contact?

  18. Jesse Plamondon-Willard (Pathoschild)
    19 October 2007 16:15

    Has there been any progress? If there has been no response, we should look elsewhere for verification.

  19. Jesse Plamondon-Willard (Pathoschild)
    24 October 2007 00:41

    Hello,

    I propose that we approve this request now.

    If we can verify the language in a reasonable time frame, that's great. If we can't, we should go ahead without it. I think it is unlikely that they're faking the language, and this long bureaucratic wait with nothing happening is stifling the community.

  20. Shanel Kalicharan
    24 October 2007 01:07

    I agree. I think faking a language would be quite difficult without *someone* noticing. Sure, there could be some sort of elaborate scheme going on, but it's far more likely that they've actually written in the language. If there are no objections in 72 hours, I shall give it final approval.

  21. Gerard Meijssen (GerardM)
    24 October 2007 06:37

    <this user has not agreed to public archival.>

  22. Jesse Plamondon-Willard (Pathoschild)
    24 October 2007 07:29

    GerardM,

    This is not hurried; the request has been waiting for months. Verification is fine if we can get it, but it has been two and a half weeks and the best response we've gotten is a delivery failed notice.

    Do you or Berto have any short-term plans to obtain verification, or are we just waiting for nothing to happen? If we are, let's just skip the extra bureaucracy and get on with it.

  23. Bèrto 'd Sèra
    24 October 2007 10:08

    Sorry no.

    You are welcome to identify an authoritative source that is unrelated to the requesting community and can provide verification for the request.

    We all know perfectly well that saying YES is a point of no return. It can be done only when all possible doubts are gone. We are not going to tell the Board that they have to shut a wiki down because "we idiots did not even check what language it was written in". That would be simply ridiculous.

  24. Bèrto 'd Sèra
    24 October 2007 11:06

    Dear Mrs. Billings,

    I write on behalf of the Wikimedia Foundation Committee for Languages, to ask your advice on the final identification of a language. We have started a project to release a Bikol edition of Wikipedia, and now need to certify that the articles listed on this page really are written in Bikol. As is always the case when nobody on the Committee can provide direct certification, we require an expert third party to confirm the identity of the language in which the material is expressed, prior to releasing the project.

    The list of pages is published at: http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/wp-bcl

    We would really appreciate it if you could have a look at them and confirm that they really are written in Bikol, or if you could kindly point us to another source for such confirmation.

    Thank you in advance for your time and attention

  25. Bèrto 'd Sèra
    24 October 2007 11:07

    Dear Mr. Davies,

    I write on behalf of the Wikimedia Foundation Committee for Languages, to ask your advice on the final identification of a language. We have started a project to release a Bikol edition of Wikipedia, and now need to certify that the articles listed on this page really are written in Bikol. As is always the case when nobody on the Committee can provide direct certification, we require an expert third party to confirm the identity of the language in which the material is expressed, prior to releasing the project.

    The list of pages is published at: http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/wp-bcl

    We would really appreciate it if you could have a look at them and confirm that they really are written in Bikol, or if you could kindly point us to another source for such confirmation.

    Thank you in advance for your time and attention

  26. Bèrto 'd Sèra
    24 October 2007 11:08

    Dear Mr. Hidalgo,

    I write on behalf of the Wikimedia Foundation Committee for Languages, to ask your advice on the final identification of a language. We have started a project to release a Bikol edition of Wikipedia, and now need to certify that the articles listed on this page really are written in Bikol. As is always the case when nobody on the Committee can provide direct certification, we require an expert third party to confirm the identity of the language in which the material is expressed, prior to releasing the project.

    The list of pages is published at: http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/wp-bcl

    We would really appreciate it if you could have a look at them and confirm that they really are written in Bikol, or if you could kindly point us to another source for such confirmation.

    Thank you in advance for your time and attention

  27. Bèrto 'd Sèra
    24 October 2007 11:10

    Dear Mr. Himes,

    I write on behalf of the Wikimedia Foundation Committee for Languages, to ask your advice on the final identification of a language. We have started a project to release a Bikol edition of Wikipedia, and now need to certify that the articles listed on this page really are written in Bikol. As is always the case when nobody on the Committee can provide direct certification, we require an expert third party to confirm the identity of the language in which the material is expressed, prior to releasing the project.

    The list of pages is published at: http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/wp-bcl

    We would really appreciate it if you could have a look at them and confirm that they really are written in Bikol, or if you could kindly point us to another source for such confirmation.

    Thank you in advance for your time and attention

  28. Bèrto 'd Sèra
    24 October 2007 11:10

    Dear Mr. Himmelmann,

    I write on behalf of the Wikimedia Foundation Committee for Languages, to ask your advice on the final identification of a language. We have started a project to release a Bikol edition of Wikipedia, and now need to certify that the articles listed on this page really are written in Bikol. As is always the case when nobody on the Committee can provide direct certification, we require an expert third party to confirm the identity of the language in which the material is expressed, prior to releasing the project.

    The list of pages is published at: http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/wp-bcl

    We would really appreciate it if you could have a look at them and confirm that they really are written in Bikol, or if you could kindly point us to another source for such confirmation.

    Thank you in advance for your time and attention

  29. Bèrto 'd Sèra
    24 October 2007 11:12

    Dear Mr. Hsiu-chuan Liao,

    I write on behalf of the Wikimedia Foundation Committee for Languages, to ask your advice on the final identification of a language. We have started a project to release a Bikol edition of Wikipedia, and now need to certify that the articles listed on this page really are written in Bikol. As is always the case when nobody on the Committee can provide direct certification, we require an expert third party to confirm the identity of the language in which the material is expressed, prior to releasing the project.

    The list of pages is published at: http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/wp-bcl

    We would really appreciate it if you could have a look at them and confirm that they really are written in Bikol, or if you could kindly point us to another source for such confirmation.

    Thank you in advance for your time and attention

  30. Bèrto 'd Sèra
    24 October 2007 11:14

    Dear Mr. Lobel,

    I write on behalf of the Wikimedia Foundation Committee for Languages, to ask your advice on the final identification of a language. We have started a project to release a Bikol edition of Wikipedia, and now need to certify that the articles listed on this page really are written in Bikol. As is always the case when nobody on the Committee can provide direct certification, we require an expert third party to confirm the identity of the language in which the material is expressed, prior to releasing the project.

    The list of pages is published at: http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/wp-bcl

    We would really appreciate it if you could have a look at them and confirm that they really are written in Bikol, or if you could kindly point us to another source for such confirmation.

    Thank you in advance for your time and attention

  31. Bèrto 'd Sèra
    24 October 2007 11:15

    Dear Mr. McFarland,

    I write on behalf of the Wikimedia Foundation Committee for Languages, to ask your advice on the final identification of a language. We have started a project to release a Bikol edition of Wikipedia, and now need to certify that the articles listed on this page really are written in Bikol. As is always the case when nobody on the Committee can provide direct certification, we require an expert third party to confirm the identity of the language in which the material is expressed, prior to releasing the project.

    The list of pages is published at: http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/wp-bcl

    We would really appreciate it if you could have a look at them and confirm that they really are written in Bikol, or if you could kindly point us to another source for such confirmation.

    Thank you in advance for your time and attention

  32. Bèrto 'd Sèra
    24 October 2007 11:16

    Dear Mr. Potet,

    I write on behalf of the Wikimedia Foundation Committee for Languages, to ask your advice on the final identification of a language. We have started a project to release a Bikol edition of Wikipedia, and now need to certify that the articles listed on this page really are written in Bikol. As is always the case when nobody on the Committee can provide direct certification, we require an expert third party to confirm the identity of the language in which the material is expressed, prior to releasing the project.

    The list of pages is published at: http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/wp-bcl

    We would really appreciate it if you could have a look at them and confirm that they really are written in Bikol, or if you could kindly point us to another source for such confirmation.

    Thank you in advance for your time and attention

  33. Bèrto 'd Sèra
    24 October 2007 11:16

    Dear Mrs. Ruffolo,

    I write on behalf of the Wikimedia Foundation Committee for Languages, to ask your advice on the final identification of a language. We have started a project to release a Bikol edition of Wikipedia, and now need to certify that the articles listed on this page really are written in Bikol. As is always the case when nobody on the Committee can provide direct certification, we require an expert third party to confirm the identity of the language in which the material is expressed, prior to releasing the project.

    The list of pages is published at: http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/wp-bcl

    We would really appreciate it if you could have a look at them and confirm that they really are written in Bikol, or if you could kindly point us to another source for such confirmation.

    Thank you in advance for your time and attention

  34. Bèrto 'd Sèra
    24 October 2007 11:17

    The last email addresses were taken from this page: http://iloko.tripod.com/linguist.html

    Feel free to send more similar invitations.

  35. Arria Belli (Maria Fanucchi)
    24 October 2007 11:40

    Your point would be quite correct, I think, if the language in question were better known. As it is, if the entire thing is fake I'm afraid it may potter along for months before anyone notices and is not too afraid to point out our mistake to us. (Because yes, it may be scary for them to try to tell the world-renowned Wikipedia that one of its sites is bogus and should be shut down. It's all human nature. "Me? But I'm nothing!")

    I suppose that's my long-winded, rambling way of saying that I agree with Berto about erring on the side of caution.

  36. Bèrto 'd Sèra
    24 October 2007 11:59

    Bikol: some failures

    See the attached messages

  37. Bèrto 'd Sèra
    24 October 2007 14:48

    This is an answer, at least; can anyone forward the original message? I'll do that myself later tonight, if nobody can.

    -----Original Message-----
    <this text is quoted from a user who has not agreed to public archival.>

  38. Arria Belli (Maria Fanucchi)
    24 October 2007 15:01

    Dear Mrs. Mattes,

    I write on behalf of the Wikimedia Foundation Committee for Languages, to ask your advice on the final identification of a language. We have started a project to release a Bikol edition of Wikipedia, and now need to certify that the articles listed on this page really are written in Bikol. As is always the case when nobody on the Committee can provide direct certification, we require an expert third party to confirm the identity of the language in which the material is expressed, prior to releasing the project.

    The list of pages is published at: http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/wp-bcl

    We would really appreciate it if you could have a look at them and confirm that they really are written in Bikol, or if you could kindly point us to another source for such confirmation.

    Thank you in advance for your time and attention

  39. Arria Belli (Maria Fanucchi)
    24 October 2007 15:02

    Dear Mr. Rubino,

    I write on behalf of the Wikimedia Foundation Committee for Languages, to ask your advice on the final identification of a language. We have started a project to release a Bikol edition of Wikipedia, and now need to certify that the articles listed on this page really are written in Bikol. As is always the case when nobody on the Committee can provide direct certification, we require an expert third party to confirm the identity of the language in which the material is expressed, prior to releasing the project.

    The list of pages is published at: http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/wp-bcl

    We would really appreciate it if you could have a look at them and confirm that they really are written in Bikol, or if you could kindly point us to another source for such confirmation.

    Thank you in advance for your time and attention

  40. Arria Belli (Maria Fanucchi)
    24 October 2007 15:02

    Done.

  41. Bèrto 'd Sèra
    24 October 2007 15:07

    Thanks :-) Hopefully we can solve this issue

  42. Arria Belli (Maria Fanucchi)
    24 October 2007 17:08

    This is the answer I got a few minutes ago. The person he CC'ed my message to is Jason Lobel <email address censored>.
    --Maria

    ---------- Forwarded message ----------
    <this text is quoted from a user who has not agreed to public archival.>

  43. Arria Belli (Maria Fanucchi)
    24 October 2007 17:14

    Thank you for your speedy response, Mr. Rubino, and thank you for cc'ing my message along.

    Mr. Lobel and Mr. Rubino, whenever you can, please write your impressions about the test Bikol Wikipedia to the LangCom's mailing list (<email address censored>), as well as to anyone else you think may be able to help, so that all the members of the committee may join in the discussion and see what we can do for this project.

  44. Jesse Plamondon-Willard (Pathoschild)
    24 October 2007 17:20

    Hello,

    For the sake of the public archives (which won't include the response you quoted), one of the emailed users confirmed the pages were Bikol and referred us to an expert.

  45. Bèrto 'd Sèra
    24 October 2007 18:06

    So it's approved as far as I'm concerned

  46. Jesse Plamondon-Willard (Pathoschild)
    24 October 2007 18:25

    Hello,

    Great. I'll mark it as approved if nobody objects within 24 hours.

  47. Gerard Meijssen (GerardM)
    24 October 2007 18:48

    <this user has not agreed to public archival.>

  48. Shanel Kalicharan
    24 October 2007 20:55

    Rushing is never good, but neither is making what is clearly an interested community wait so long. Wait too long, and their interest will understandably wane. IIRC, one user has already quit because approval is taking so long. There needs to be a balance.

  49. Gerard Meijssen (GerardM)
    24 October 2007 21:24

    <this user has not agreed to public archival.>

  50. Jesse Plamondon-Willard (Pathoschild)
    25 October 2007 02:34

    Hello,

    Just relaying the message:

    "Hi,

    I was referred to this page by way of the [[en:Wikipedia:Tambayan_Philippines]]. I have worked with Jason Lobel & Carl Rubino above and am active on Wikipedia. As someone who has studied the language in the past, I can confirm that the language in the Bikol Wikipedia is indeed in Bikol. --[[en:User:Christopher Sundita|Chris S.]]"
    -- <http://meta.wikimedia.org/wiki/?diff=722627>

  51. Gerard Meijssen (GerardM)
    25 October 2007 05:34

    <this user has not agreed to public archival.>

  52. Jesse Plamondon-Willard (Pathoschild)
    25 October 2007 06:01

    Hello,

    *Now* would it be rushing to propose approval in 24 hours if nobody objects? :)

  53. Bèrto 'd Sèra
    25 October 2007 06:23

    This worries me a bit. It’s basically none of our business to set up an orthography and nothing in the policy says we should. Yet, most of the trouble we had in the past comes from poor management of such issues.

    I will not withdraw my approval for Bikol, but I do ask whether we should add a requirement (for the future) that communities settle a policy about such matters prior to being approved.

    Bèrto ‘d Sèra

    _____

    From: Indonesian/Malay Texts [mailto:<email address censored>]
    Sent: Thursday, October 25, 2007 5:09 AM
    To: "Bèrto 'd Sèra"
    Subject: Re: Bikol language
    <this text is quoted from a user who has not agreed to public archival.>

  54. Gerard Meijssen (GerardM)
    25 October 2007 07:11

    <this user has not agreed to public archival.>

  55. Bèrto 'd Sèra
    25 October 2007 11:47

    Hi!

    Thank you very much for your time and advice :-) We will report the advice about consistency and the compliments to the community. We all welcome the newborn Bikol wikipedia.

    Bèrto ‘d Sèra

    _____

    From: Indonesian/Malay Texts [mailto:<email address censored>]
    Sent: Thursday, October 25, 2007 2:39 PM
    To: Bèrto 'd Sèra
    Subject: RE: Bikol language
    <this text is quoted from a user who has not agreed to public archival.>

  56. Arria Belli (Maria Fanucchi)
    25 October 2007 12:15

    Just forwarding it along; I responded to her saying that no, there's no need to check every page, all we needed to know was whether it really was Bikol.
    --Maria

    ---------- Forwarded message ----------
    From: veronika mattes <<email address censored>>
    Date: Oct 25, 2007 5:01 AM
    Subject: Re: Bikol language
    To: Maria Fanucchi <<email address censored>>
    <this text is quoted from a user who has not agreed to public archival.>

  57. Jesse Plamondon-Willard (Pathoschild)
    25 October 2007 12:15

    GerardM,

    Bikol is already in the message files. I'll mark it as approved tonight (at the 24-hour mark) if nobody objects.

  58. Bèrto 'd Sèra
    25 October 2007 12:22

    Okay for me :)

    Can anyone forward the compliments they have received from our sources? So they get value for their money :) Since they waited for us, at least they get to know that we did look at what they did and managed to market their edition a bit in the meantime.

  59. Bèrto 'd Sèra
    25 October 2007 12:23

    Definitely :-) They have all been more than nice, and other sources have already taken a deeper look into it.

  60. Jesse Plamondon-Willard (Pathoschild)
    25 October 2007 12:24

    Hello,

    I'll leave them a message summarizing our discussions and the very positive results of the verification tonight. :)

  61. Jesse Plamondon-Willard (Pathoschild)
    CC Erik Möller (Executive Secretary)
    26 October 2007 17:58

    Hello Erik,

    The language subcommittee recommends the creation of a Bikol Wikipedia. The community is sufficiently diverse and active, the discussion on Meta is unanimous, the test project is successful, the interface is translated, and the language has a standard code. Further information can be found by following the links below.

    The request will be approved and created if the board does not object within four days, as previously agreed and described in our charter.

    More information:

  62. Erik Möller (Eloquence) Executive Secretary
    26 October 2007 13:59

    <this user has not agreed to public archival.>

  63. Bèrto 'd Sèra
    28 October 2007 14:29

    Thank you!

    We already managed to securely identify the language as Bikol, and the Bikol Wikipedia is currently being released :-)

    Again, thank you for your time and attention

    Bèrto ‘d Sèra

    _____
    From: potetjp [<email censored>]
    Sent: Sunday, October 28, 2007 2:00 PM
    To: Bèrto 'd Sèra
    Subject: Re: Bikol language

    <this text is quoted from a user who has not agreed to public archival.>

  64. Jesse Plamondon-Willard (Pathoschild)
    01 November 2007 14:04

    Approved; I'll file a request for the wiki's creation.

Wikipedia Seri[edit]

No action was taken on the request for a Seri Wikipedia.

  1. Jesse Plamondon-Willard (Pathoschild)
    07 October 2007 20:52

    Hello,

    I've conditionally approved the Seri Wikipedia per the new procedure. The current status is listed on the request page <http://meta.wikimedia.org/wiki/Requests_for_new_languages/Wikipedia_Seri>.

  2. Gerard Meijssen (GerardM)
    07 October 2007 21:04

    <this user has not agreed to public archival.>

  3. Bèrto 'd Sèra
    07 October 2007 21:13

    Hmmm gentlemen... maybe I’m dumb, but on the given link I can’t see any approval status or any statement about content... Am I looking at the right place? http://meta.wikimedia.org/wiki/Requests_for_new_languages/Wikipedia_Seri

  4. Gerard Meijssen (GerardM)
    07 October 2007 21:15

    <this user has not agreed to public archival.>

  5. Bèrto 'd Sèra
    07 October 2007 21:21

    Can’t we simply conditionally approve it without statements about content? If the ISO code is there, why not?

  6. Gerard Meijssen (GerardM)
    07 October 2007 21:30

    <this user has not agreed to public archival.>

  7. Jesse Plamondon-Willard (Pathoschild)
    07 October 2007 21:40

    Gerard,

    I'm still confused. Just this week you accused me of stupidity for privately objecting to a conditional approval on procedural grounds, but your main (and public) objection to this conditional approval is procedural? Do you object to the conditional approval itself, or to some procedure that was not followed, or to the checklist, or to a particular step listed as done?

    To change the status back to the discussion phase, change "conditional" to "open". To remove something from the checklist, remove "done" beside that parameter. If you don't know how to change something, contact me and I will help you.

  8. Bèrto 'd Sèra
    07 October 2007 21:47

    May I say that if we had a ticket service NONE of this would happen? I know I'm boring, but still...

  9. Sabine Cretella
    07 October 2007 21:57

    No, you are not boring ... I personally don't like ticket services, but you are right ...

    uhmmmm ... please ... some less fights would be nice ... some clarifying is necessary, but only when you all have calmed down ... please ...

    /me back to do some work and then go to bed ...

  10. Gerard Meijssen (GerardM)
    07 October 2007 21:50

    <this user has not agreed to public archival.>

  11. Jesse Plamondon-Willard (Pathoschild)
    07 October 2007 22:09

    Gerard,

    There is a difference between calmly proposing a change and unilaterally reverting one; I'll leave it to you to decide which is more appropriate.

    I strongly disagree. The subcommittee's purpose is not only to rubberstamp requests, but also to guide eligible communities through the creation process. This is demonstrated by the users who, confused by your unexplained conditional approvals, have been contacting various subcommittee members to ask what they need to do next. It is important to explain what remains to be done, which is accomplished by the checklist on the page.

    By the way, the request is at <http://meta.wikimedia.org/wiki/Requests_for_new_languages/Wikipedia_Seri>; I forgot to link to it.

  12. Bèrto 'd Sèra
    07 October 2007 22:21

    Jesse...

    "sufficiently unique" really is bullshit.

    So is a "number of native speakers" that nobody can check. I have to find a "real" native speaker for Bikol tomorrow, and that's the only check you can do. Even in that case... how do we tell that the person answering is capable of giving a judgement and actually even knows what Bikol is?

    We can only trust... so those numbers of native speakers are, at best, empty. Please be realistic.

  13. Gerard Meijssen (GerardM)
    07 October 2007 22:21

    <this user has not agreed to public archival.>

  14. Gerard Meijssen (GerardM)
    07 October 2007 22:18

    <this user has not agreed to public archival.>

  15. Jesse Plamondon-Willard (Pathoschild)
    07 October 2007 22:26

    Hello,

    Alright. I strongly disagree with the way you unilaterally conditionally approve requests, and you strongly disagree with the way I do so. We're both violating the policy in doing so.

    My suggestion is that we both conditionally approve requests in the manner normally done and described by policy: make a proposal on the mailing list, and implement it in 48 hours if there is no objection. This eliminates the need for the conflict that has filled the mailing list recently, and a 48-hour delay on a private mailing list adds no particular bureaucracy or difficulty to either subcommittee members or requesting users.

  16. Gerard Meijssen (GerardM)
    07 October 2007 22:36

    <this user has not agreed to public archival.>

  17. Shanel Kalicharan
    07 October 2007 22:53

    Gerard,

    Conditional approval does not simply allow the start of a language in the Incubator. We've discussed this before. Conditional approval specifically means that we'll approve a wiki if the final requirements are met (subject to our discretion). This is not my opinion, it is the definition given by the policy. All those projects you've conditionally approved under the policy will expect to be approved once they meet all requirements, regardless of your personal idea of what conditional approval should be.

    I've proposed eliminating conditional approval in favour of the new checklist (keeping the final decision step), but there was no positive response.

    Berto,

    If you think the policy should be changed, please concretely and concisely explain your proposal in a new thread. I think I would not object to most of your proposed changes.

  18. Jesse Plamondon-Willard (Pathoschild)
    07 October 2007 22:57

    Hello,

    I accidentally sent that message from Shanel's email address because I'm using her computer at the moment. I apologize for any confusion. :)

  19. Gerard Meijssen (GerardM)
    07 October 2007 23:06

    <this user has not agreed to public archival.>

  20. Jesse Plamondon-Willard (Pathoschild)
    07 October 2007 21:27

    Gerard,

    I don't understand your position. Conditionally approving a request without explanation is fine, but doing so while simply checking off demonstrably fulfilled requirements is not? Further, politely proposing revoking conditional approval is "stupid" and demonstrates Hitler-like qualities, but blanket-reverting another member's good-faith edits is fine?

Wikipedia Ottoman Turkish[edit]

The second request for an Ottoman Turkish Wikipedia was rejected.

  1. Jesse Plamondon-Willard (Pathoschild)
    16 October 2007 03:21

    Hello,

    I propose the conditional approval of the Ottoman Turkish Wikipedia. I've compiled statistics and links at <http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/wp-ota>. The only missing criterion for final approval is localization.

    I'll implement this decision if there are no objections in 48 hours; if you need more time to formulate your objections, please say so.

  2. Shanel Kalicharan
    16 October 2007 03:38

    Support. Very active test project with many active editors.

    Shanel

  3. Gerard Meijssen (GerardM)
    16 October 2007 05:47

    <this user has not agreed to public archival.>

  4. Gerard Meijssen (GerardM)
    16 October 2007 06:32

    <this user has not agreed to public archival.>

  5. Bèrto 'd Sèra
    16 October 2007 08:46

    OPPOSE: This is an “historical language”. Unless anyone can provide evidence of a “living community” that can use such a wiki, I oppose the project no matter how much work they put into it. The WMF mission is not to play empty intellectual games. They are absolutely welcome to make a Wikisource, though. That really would be extremely useful to living people.

  6. Jon Harald Søby
    16 October 2007 10:10

    I agree with Bèrto, opposing on the basis of it being a historical language. If we say yes to Ottoman Turkish, then we have to say yes to all other historical language requests as well. They are welcome to open a Wikisource or to take their Incubator content and make an Ottoman Turkish encyclopedia somewhere else (like Wikia), but I do not think it should be a Wikipedia.


  7. 16 October 2007 17:00

    <this user has not agreed to public archival.>

  8. Sabine Cretella
    17 October 2007 20:49

    I got this note on my user talkpage:


    Approval of a new Wikipedia for Ottoman Turkish

    Sabine, As you may know, there is currently discussion among Language Subcommittee members regarding the fate of the Ottoman Turkish test Wikipedia. I and other contributors who have been involved in developing the test Ottoman Turkish Wikipedia over the last year are quite concerned that some members of the committee are opposed to the creation of an active Wikipedia for this language, on the grounds that this is not a living language. May I point out that even though this language is not living in the sense of being used in media and publications, there are various communities which use it as a literary form of their local dialect. For instance, Azeris in Iran and Turkmen in Iraq use this language as the literary form of their language. I would also like to point out that, if we consider precedence, I currently know of the following historical languages which have active Wikipedia projects:

    We appreciate your involvement and that of the other members of the Subcommittee in the discussion and hope for a positive outcome of the decision.

    Thank you --Mehrdad <http://meta.wikimedia.org/wiki/User:Mehrdad> 18:14, 17 October 2007 (UTC)


    It is for your info and for further discussion - sorry, I cannot follow. I just came home from the hospital and need some sleep - don't know for how many days this will go ahead.

  9. Jesse Plamondon-Willard (Pathoschild)
    17 October 2007 18:17

    Hello,

    This comment seems relevant to the issue here: <http://meta.wikimedia.org/wiki/Requests_for_new_languages/Wikipedia_Ottoman_Turkish_2?diff=712458&oldid=690018>. If this can be verified, would it be enough for those here who object to revisit the question, or do we depend entirely on the type label assigned by the ISO 639-3 committee?

  10. Bèrto 'd Sèra
    18 October 2007 05:59

    Hi!

    If you discuss just one ISO label, then you have no way to refuse to discuss them all, which is not what we are here for. My POV remains that if they feel ISO is wrong they should fix their problem with ISO, not with us.

    I had already mentioned the international role of Ottoman Turkish and I can imagine that at least some of their claims of a use as a "classical language" (thus comparable to Latin and ancient Greek) *might* have some grounds.

    Turning this conditional into a present tense requires more than just a personal claim, though. It requires definite data about the number of state or private school programs in which OT has a relevant role, an estimate of the population getting instruction in OT and MOST OF ALL it needs reliable external official sources that can back their claim. There cannot be a "classical language" where there is no documented instruction program.

    The fact that other languages were released prior to the current policy is not an acceptable argument. At some point even Klingon was released.

    IMHO, their claim remains a reason not to push them to a Modern Turkish Wikisource, though. If they can make such a Wikisource as a start-up project, they can also use it as evidence to back their claim of a consistent number of living speakers for the language. I consider it understandably difficult for them to collect official data from areas of the planet that are very badly connected to the net, so they might want to contact a number of European Turkish schools, too. They badly need data and numbers if they want to discuss with ISO.


  11. 18 October 2007 22:45

    <this user has not agreed to public archival.>

  12. Jesse Plamondon-Willard (Pathoschild)
    21 October 2007 15:53

    Hello,

    If I understand correctly, historical languages are *not* extinct. When there are no living native speakers, they are classified as extinct (if they went extinct in recent centuries) or ancient (if they went extinct many centuries ago). The "historical" type means that there are native speakers of that language, but that a newer recognized variant of the language exists.

    Given that, is there still opposition to conditionally approving the Ottoman Turkish Wikipedia?

    <http://www.sil.org/ISO639-3/types.asp>

  13. Bèrto 'd Sèra
    21 October 2007 16:08

    Hi!

    There are NO Ottoman NATIVE speakers on earth; otherwise the language would be classified as "living". An historical language represents a "phase" in the history of a living language, and when the phase is not too far away in time (as with OT) the language can still be locally used as a "literary register".

    In this sense, Classical Greek was used for much of the Byzantine Empire's history. Nobody would use it for everyday life, but people would get an education in it. They wouldn't be "native speakers" of it, and they'd hardly need it for ordinary life, but it would be vital for them to be accepted at Court. The same happened with Latin for most of the late Middle Ages, since the Council of Tours in 813 explicitly ordered that Latin be abandoned in the liturgy for ordinary people (as even a good number of clerics were no longer able to use it).

    The opposition remains, unless it is proved that the number of people receiving education in OT in a wide number of countries can qualify it for a "classical language" role. In that case parity of rights with Classical Greek and Latin should be granted.

    It has nothing to do with "dignity"; it's merely a matter of the extent of its usage. Otherwise you'll have to release a wiki in Old Icelandic because a few neo-pagans have their children study in the language of the Eddur (http://en.wikipedia.org/wiki/Edda ).

  14. Bèrto 'd Sèra
    21 October 2007 16:22

    Hi again,

    The main aspect involved with historical languages is that they are comparable to a software branch that is NOT undergoing any further development. The current "living" branch is another; in this case, it is Modern Turkish.

    By allowing further production in "closed branches" you end up producing "attested forms" that cannot have any validity, because that language is "sealed". It can definitely still be used as a literary language (and sometimes as a scientific language, like Latin and Greek in biology and international pharmacy) but it is not expected to behave as a living language.

    Whenever you start semantic production in any of these languages you end up forming what is a "resurrected" version, i.e., effectively a "constructed language" that would only exist within WMF. I can hardly imagine this being included in WMF's mission.

  15. Bèrto 'd Sèra
    26 October 2007 07:21

    Hi!

    I think this may be of interest http://meta.wikimedia.org/wiki/User_talk:Mehrdad#Ottoman_turkish

  16. Jesse Plamondon-Willard (Pathoschild)
    28 October 2007 22:22

    Hello,

    I propose the Ottoman Turkish Wikipedia be rejected based on this discussion. The request can be resubmitted if the ISO 639 committee reclassifies Ottoman Turkish, which is unlikely.

    I'll implement the decision in 72 hours if there is no objection. If you need more time to formulate your objections, please say so.

  17. Shanel Kalicharan
    28 October 2007 23:12

    Nuuuuuuuuuu.




    Nuuuuuuuu objection.

  18. Bèrto 'd Sèra
    29 October 2007 07:29

    Wait, they have provided initial data about OTA being taught in schools in a role that would qualify it as a “classical language”. If this proves to have enough territorial extension (spanning a number of countries, with some definite literary or scientific target use) we cannot deny them without discriminating against them. The data isn’t complete at the moment, but they are working on it and should be allowed time to prove their point.

  19. Gerard Meijssen (GerardM)
    29 October 2007 07:53

    <this user has not agreed to public archival.>

  20. Bèrto 'd Sèra
    29 October 2007 08:07

    Hi!

    I’m not clear on this. The current classification set at ISO says a language is one of:

    1. living
    2. extinct
    3. ancient
    4. historic
    5. constructed

    We want to warrant all “living” codes a pre-conditional approval and also warrant a non-discriminating policy towards those that can qualify for a role similar to “Latin and Greek” (“classical” in our terminology, thus far). How would a change of classification at the ISO level tell us that this is “Latin”?

  21. Gerard Meijssen (GerardM)
    29 October 2007 08:34

    <this user has not agreed to public archival.>

  22. Bèrto 'd Sèra
    29 October 2007 09:15

    Hi!

    This is okay; both Greek and Latin are in this situation because of the Roman Church and scientific usage. This is why I want to understand what OTA is used for in school programs. That is, whether semantic production is supposed to be ongoing or not OUTSIDE WMF.

    My question was about how such a use would change the OTA ISO 639-3 label. Latin is currently classified as “ancient”, for example. Given the 80-year time span (1928 seems to be the official border between OTA and TUR) I can’t imagine any other label apart from “historical” for OTA, and ISO doesn’t seem to have any label that can mark whether semantic production is currently ongoing or not.


  23. 29 October 2007 22:01

    <this user has not agreed to public archival.>

  24. Shanel Kalicharan
    29 October 2007 22:21

    If that's the case, then I do have an objection (Too bad Jesse. ;) ).

  25. Gerard Meijssen (GerardM)
    29 October 2007 22:26

    <this user has not agreed to public archival.>

  26. Gerard Meijssen (GerardM)
    29 October 2007 22:40

    <this user has not agreed to public archival.>


  27. 30 October 2007 01:07

    <this user has not agreed to public archival.>

  28. Bèrto 'd Sèra
    30 October 2007 11:03

    Hi!

  29. Jesse Plamondon-Willard (Pathoschild)
    04 November 2007 20:01

    Hello,

    For those who haven't gotten Mehrdad's message regarding the teaching of Ottoman Turkish in universities, see <http://meta.wikimedia.org/wiki/User_talk:Pathoschild?oldid=735629#Status_of_Ottoman_Turkish_in_Education>.

  30. Bèrto 'd Sèra
    04 November 2007 20:10

    Hi!

    Well, these are numbers. Positive numbers. I'd say these guys deserve consideration, at this point.

    It's NOT as big as Latin, but on the other hand the range of subjects:

    • Turkish language and Literature (Turk Dili ve Edebiyati Bolumu)
    • Art history (Sanat Tarihi bolumunda)
    • History (Tarih Bolumu)
    • Turkish language teachers (Turkce Ogretmenligi)
    • History teachers (Tarih Ogretmenligi)
    • Literature teachers (Edebiyat Ogretmenligi)

    does make for a "classical role".

    I would add that the language most probably has, or is likely to have, a similar role in a number of other former Ottoman countries.

    I give a conditional YES. My condition is that I want to hear everyone's objections, if any, and if any of them makes sense I'll withdraw my positive vote.

  31. Shanel Kalicharan
    05 November 2007 00:36

    Given the evidence he's provided, I also have no objection to conditional approval.

  32. Gerard Meijssen (GerardM)
    05 November 2007 05:01

    <this user has not agreed to public archival.>

  33. Bèrto 'd Sèra
    05 November 2007 06:35

    Hi!

    It has not been suggested that it is living, at least not by the requesting party. The requester wrote that he agrees with the ISO label of “historical”, and specifically “until 1928”.

  34. Gerard Meijssen (GerardM)
    05 November 2007 06:42

    <this user has not agreed to public archival.>

  35. Bèrto 'd Sèra
    05 November 2007 06:56

    Hi!

    I’m not, either. Yet we, the WMF, have historically granted projects to western “classical” languages… it would look quite discriminatory if we refused this one, given that it’s actively employed in teaching.

  36. Gerard Meijssen (GerardM)
    05 November 2007 07:08

    <this user has not agreed to public archival.>

  37. Jesse Plamondon-Willard (Pathoschild)
    05 November 2007 07:46

    Hello,

    I also object to allowing new wikis in historical languages.

    The Wikimedia Foundation seeks to provide the sum of human knowledge to every human being. That means that the single purpose of allowing a project in a new language is to make it accessible to more human beings. If a language is historical, allowing a project in that language does not fit our mission because it does not make that project accessible to more human beings.

    If this language can be shown to have native communities (not speakers of other languages who studied it in university), then I am all for approving it. Otherwise, I oppose; we should instead allow a project in their native languages if we have not already done so.

  38. Bèrto 'd Sèra
    05 November 2007 08:04

    Okay, this means that ALL historical languages get automatically refused. So the policy for conditional approval drops the "discussion for historical languages" bit.

    Is everyone okay with this? I am. As stated, my only concern was a matter of fair play to non-Europeans, but logically speaking Latin shouldn't get a wiki, because it's got NO native speakers. If we choose to be logical we can drop another whole lot of endless discussions and concentrate on projects that have living users.

  39. Gerard Meijssen (GerardM)
    09 November 2007 05:53

    <this user has not agreed to public archival.>

  40. Bèrto 'd Sèra
    09 November 2007 09:32

    If this is true we should tell ISO to update the label to “living” and send them evidence. “Up to 1928” is “historical”, “used today” is “living”, no matter how many native speakers. Both TUR and OTA are based on the last OTA version, the main difference being in the script, plus 80 years of semantic drift (which IMHO cannot be enough to reach full mutual unintelligibility). I don’t think we should start to modify the label ourselves, because this would set a dangerous precedent, yet I’m in favor of starting an update process if and when we are given evidence that the language is still “living”.


  41. 12 November 2007 00:46

    <this user has not agreed to public archival.>

  42. Shanel Kalicharan
    12 November 2007 00:56

    Well, that sucks. >_>. Thanks for asking though, Karen.

  43. Bèrto 'd Sèra
    12 November 2007 04:31

    Hi!

    Yes, getting in touch with ISO IS the right thing to do. As I have always said, NO standard can be expected to be perfect, especially in such a field. I bet the number of such grey areas will only grow as we enter the land of lesser-known linguistic entities. Basically this is a field in which the degree of classification of an entity largely depends on how much the entity came to interest scholars. I may be cynical, but small niche languages actually make better academic grants, so… while we may end up having a wonderful degree of knowledge about an entity spoken by 300 people, an entity spoken by millions may remain largely unaddressed. Scholars need to pay their rent like anybody else.

    This is where the WMF, and specifically our own activity, should put pressure on ISO in order to reach a “better standard”. While there’s no use in making a WMF standard, there is a large need for any standard to become aware of its flaws, and since we happen to be the planet’s largest distributed cultural institution it is our duty to follow the “be bold” stance and go “edit ISO” (or whatever other standard we had chosen). It may not yield immediate results, but it is something we must do, because standards rule the lives of the billions out there, and it’s everybody’s social duty to make sure that the available rules are “fair” rules. Today it’s a wiki, tomorrow it’s an operating system UI… when we make exceptions we might “look nice”, yet in the end we don’t help anyone, because what we do is bound to remain a “singularity” of ours. When we help correct an international standard we pave the way for a community to achieve fair rights in a planet-wide context.

    Localization will not always remain at the current naïve stage. It’s already evolving into something closer to an industry, as companies are starting to realize that delivering localized content and software is one of the most powerful marketing tools. The further we go that way, the more standards will end up dictating rules, so “standard maintenance” is a must if we don’t want the advantages of this evolution to reach only some “happy few”. Rome wasn’t built in a day, and OTA may not be properly classified in a week, yet it’s worth fighting to open a formal channel for our enquiries and/or requests for change. There is much more than OTA at stake.

    I will answer other comments on the choice of ISO as a standard separately, as they do not specifically involve OTA.

  44. Bèrto 'd Sèra
    05 December 2007 17:48

    Hi all,

    Do we have a final answer for Ottoman Turkish?

  45. Bèrto 'd Sèra
    09 December 2007 21:37

    Last call... if nothing happens before Xmas I'll take the liberty of refusing the project before the New Year. We cannot keep them working forever if we are not willing to even discuss the issue.

  46. Gerard Meijssen (GerardM)
    09 December 2007 23:05

    <this user has not agreed to public archival.>

  47. Jesse Plamondon-Willard (Pathoschild)
    09 December 2007 23:18

    Hello,

    As I recall, we asked for evidence of living native communities. If it is actually historical (possibly taught as an optional subject in university, but with no native speakers), it should be rejected.

  48. Bèrto 'd Sèra
    09 December 2007 23:31

    We have no evidence of native speakers, as rumors do not count as evidence. We DO have official data about the language being used for teaching, though. The only PROVEN equation we can build at this point is with Latin.

    I'm ready to allow that there's probably going to be a great lack of official data (and strong political bias) in the area, but nevertheless I cannot see how WE can decide whether a language is spoken or not, in the absence of such data.

    Mine is a "no", with a hint for them to ask for a modification of the ISO flag, in case their language really is alive.

  49. Jesse Plamondon-Willard (Pathoschild)
    09 January 2008 08:26

    Hello,

    I have rejected this request based on the extensive subcommittee discussion with the following explanation:

    "It is the policy of the language subcommittee that only languages with living native communities may create new wikis. Ottoman Turkish is classified by ISO 639-3 as "historical", which means that it is an older form of Turkish. We have asked for evidence of living native Ottoman Turkish communities from various Ottoman Turkish requesters and publicly on other pages; while we received some evidence of non-native use (such as in academia), unfortunately none have been able to provide any evidence of native use.

    If such evidence can be provided, please notify the subcommittee at Talk:Language subcommittee so that we can review this decision."

    < http://meta.wikimedia.org/wiki/Requests_for_new_languages/Wikipedia_Ottoman_Turkish_2 >

  50. Bèrto 'd Sèra
    09 January 2008 08:30

    Would we? If such evidence is present, it should actually be ISO that reviews its classification before we move. Anyway... this is theory.

  51. Sabine Cretella
    09 January 2008 09:29

    Pathoschild, in order to avoid trouble with many "outside sources that would come along", could you please change the message according to my note left on the talk page:

    http://meta.wikimedia.org/wiki/Talk:Requests_for_new_languages/Wikipedia_Ottoman_Turkish_2

    "Please change the last part of the comment for rejection "so that we can review this decision" into: so that we can reconsider the project once the ISO 639-3 code has been amended. The request for changes to the ISO code has to follow the procedure of SIL - and please note that I underline SIL - having a change in Ethnologue is not enough - the change must be made with the maintainer of ISO 639-3. The Language Committee bases its decisions on ISO 639-3 language codes and not on evidence brought to us in "just some way". I am not changing this myself since the note was signed by another member of the langcom. Thanks!"

    Thank you :-)

  52. Jesse Plamondon-Willard (Pathoschild)
    09 January 2008 10:52

    Sabine,

    That wouldn't be accurate, though. The "historical" type does not mean that there cannot be living communities, it only suggests that there are none. If the requesters can provide evidence of living communities, it would not contradict the classification, but it would reverse the reasoning for the rejection.

    Thus, we'd need to review the decision once such evidence is available rather than wait for an ISO code change.

  53. Bèrto 'd Sèra
    09 January 2008 11:15

    Hi!

    I would block such a review, Jesse. A "Living" language is a living language. We must NOT leave room for LangCom to alter ISO in any way.

    As I have said often before, I myself am sometimes unhappy with their classifications, but it is none of LangCom's business to issue classifications of languages. Historical languages are not an exception.

  54. Sabine Cretella
    09 January 2008 17:41

    So in your opinion we should decide whether a language is living or not???

    Sorry, but I will not decide something like that - it would get us loads of trouble with many languages. Remember: we base our decisions on a standard; if this standard is not correct it is the standard that needs amendment, and it is not up to us to do this.

    It is not a matter of us not knowing that things are a certain way, but a matter of it not being up to us to decide whether a language is living or not. If we start to do such things we will get into endless discussions like when people voted for the creation of a Wikipedia (and don't let me talk about those ...).

    So even if someone brings me evidence of whatever: linguists will decide about it, not us - we are not linguists, we are just a commission of ordinary people. How can we even think about doing the job of professionals when it is often clear enough that these professionals themselves get into trouble with certain things?

    If the language were living, SIL would classify it as living and not as historical. So what needs to change is ISO 639-3.


  55. 09 January 2008 23:34

    <this user has not agreed to public archival.>

Archives up to date[edit]

  1. Jesse Plamondon-Willard (Pathoschild)
    17 October 2007 06:24

    Hello,

    I've finished bringing the public archives up to date at <http://meta.wikimedia.org/wiki/Special_projects_subcommittees/Languages/Archives>.

Wikiquote Limburgish[edit]

The request for a Limburgish Wikiquote was approved.

  1. Shanel Kalicharan
    18 October 2007 15:59

    Hello all,

    Per the message on my talk page<http://meta.wikimedia.org/wiki/User_talk:Shanel#Translation_Limburgish_Interface>, the user interface for Wikiquote Limburgish has been translated. The test project has now met all prerequisites, and can be given final approval. I will send the recommendation to Erik in 48 hours if there are no objections.

  2. Bèrto 'd Sèra
    18 October 2007 16:00

    Okay for me

  3. Gerard Meijssen (GerardM)
    18 October 2007 16:15

    <this user has not agreed to public archival.>

  4. Jesse Plamondon-Willard (Pathoschild)
    18 October 2007 17:29

    Hello,

    The test project is successful; see <http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/wq-li>. The MediaWiki fallback is English, so they'll need to explicitly set any untranslated messages to what the Dutch messages say. That said, none of the fallback messages are critical, so that's fine.

    They're ready to go. :)

  5. Jesse Plamondon-Willard (Pathoschild)
    25 October 2007 14:19

    Hello,

    Shanel notified the board four days ago, and we've received no objection. I've marked the proposal as approved and notified the developers.

  6. Shanel Kalicharan
    25 October 2007 15:38

    Stealer.

  7. Jesse Plamondon-Willard (Pathoschild)
    25 October 2007 17:09

    You can steal the Bikol Wikipedia approval tonight. ;)

Wikipedia Pontic[edit]

The request for a Pontic Wikipedia was conditionally approved.

  1. Gerard Meijssen (GerardM)
    20 October 2007 19:16

    <this user has not agreed to public archival.>

  2. Jon Harald Søby
    20 October 2007 21:37

    I say aye. No reason not to.

  3. Bèrto 'd Sèra
    21 October 2007 11:53

    ok

  4. Jesse Plamondon-Willard (Pathoschild)
    28 October 2007 21:54

    Conditionally approved.

Novial localization[edit]

GerardM requests that the local translations used by the Novial Wikipedia be exported to the MediaWiki localization files.

  1. Gerard Meijssen (GerardM)
    21 October 2007 06:02

    <this user has not agreed to public archival.>

  2. Jesse Plamondon-Willard (Pathoschild)
    21 October 2007 16:52

    Hello,

    MediaWiki already allows Novial localization, but adding it to BetaWiki will require collaboration with the Novial editors. We don't want to export messages that were written specifically for the Novial Wikipedia, such as references to that wiki's policies.

  3. Gerard Meijssen (GerardM)
    21 October 2007 17:35

    <this user has not agreed to public archival.>

  4. Jesse Plamondon-Willard (Pathoschild)
    21 October 2007 19:15

    GerardM,

    Please check your facts before telling me to check my facts.

    The reason you cannot select Novial in any project is that there are no localization files for Novial. You can select it in BetaWiki and begin creating the localization files: <http://nike.users.idler.fi/betawiki/Special:Translate?task=view&group=core&language=nov>. There is no need to translate again; Novial editors can simply copy and paste, generalizing as they go. If there are Novial editors willing to work on it immediately, Nikerabbit can even import the messages immediately for them to clean up before they are committed.

    Yes, localizations in BetaWiki are regularly committed to the MediaWiki localization files. This means that a bad export will result in a bad interface for that language, which is likely to confuse users (particularly when it tells users that they must follow a policy that does not exist on that wiki).

    What routines do you describe that somehow generalize messages automatically? Where are they implemented?

    As I've said, we can easily add their translations to BetaWiki with a little collaboration with the Novial editors.

  5. Bèrto 'd Sèra
    21 October 2007 19:46

    Hi,

    I don't mean to add gasoline to the fire, but possibly BOTH things may happen. When we went to BetaWiki we first imported the file from my manual PMS version, THEN I cleaned it up.

    I think it would have been rather time-consuming to redo it all (since we translate EVEN the photo messages). It was much quicker like that. Many of the messages fall into the "non-exportable section" anyway. I seem to remember that policies are not exported from BetaWiki; ours surely were not, anyway.

    Still, you do need a native speaker to have a look at it before exporting it.

  6. Jesse Plamondon-Willard (Pathoschild)
    21 October 2007 20:10

    Berto,

    I agree entirely. If there are native editors willing to clean it up, I see no problem with importing it directly into BetaWiki.

    Either way, this is out of our control; Nikerabbit is the one who can decide and do this.

  7. Gerard Meijssen (GerardM)
    21 October 2007 20:25

    <this user has not agreed to public archival.>

Multilingual MediaWiki update[edit]

GerardM notes that the Multilingual MediaWiki has been added to the Wikimedia subversion server. (It is currently not enabled or necessarily complete.)

  1. Gerard Meijssen (GerardM)
    21 October 2007 06:04

    <this user has not agreed to public archival.>

  2. Bèrto 'd Sèra
    21 October 2007 11:54

    Now that’s good news :-)

Wikipedia Volapük[edit]

No decision was taken on the proposal to close the Volapük Wikipedia.

  1. Sabine Cretella
    24 October 2007 05:25

    http://meta.wikimedia.org/w/index.php?title=User_talk:SabineCretella&diff=0&oldid=719526

    just for your info ... well: it is not for langcom to decide, but each of us can of course privately express an opinion

    ciao, sabine

  2. Gerard Meijssen (GerardM)
    24 October 2007 06:36

    <this user has not agreed to public archival.>

  3. Bèrto 'd Sèra
    24 October 2007 10:58

    Hi!

    There’s an old saying in Piedmont about those who look for trouble and finally get it. I can expect a good number of such referendums against most botpedias. The language here is but an excuse; what people are really upset about is that a basically empty wiki got into the top positions. Since there’s no rule that can explicitly attack this, they attack the language as such. Now this is downright WRONG.

    I’m absolutely in favor of limiting botpedias and avoiding projects that build 100k articles, each of which is a prime number with text like “5: prime number following 3 and preceding 7”, because this is an absolute waste of server space and no more than a marketing trick. But this cannot be hidden behind a linguistic/number-of-users issue. They had better set a policy about bot usage.

  4. Arria Belli (Maria Fanucchi)
    24 October 2007 11:31

    (I did not read all the comments on the proposal for closing page, so forgive me if I'm repeating an argument already stated there.)

    I think that the problem with the Volapük Wikipedia is not so much the fact that only around 25-30 people can speak it, but that only 25-30 can speak this constructed language. A lot of the people on the proposal page mention the number of speakers, but I think that's not really what's bothering them. I think they would probably vote to keep another similar Wikipedia with just as many speakers, because they care more about natural languages that are in danger of extinction than about a constructed one. This doesn't mean they want to shut down all WPs in constructed languages (no one is calling for the closing of the Esperanto one, for instance), just those that are less well-known and spoken.

    Does my point make any sense or is it too early in the morning for me to try to reinvolve myself in the LangCom? -__-

Wikisource Armenian[edit]

The request for an Armenian Wikisource was approved.

  1. Shanel Kalicharan
    26 October 2007 16:35

    Hello all,

    The interface has been translated, and all the prerequisites have been fulfilled. The project can now be given final approval.

  2. Bèrto 'd Sèra
    26 October 2007 16:51

    Ok for me as soon as the UI is in the distro, i.e., possibly already now.

  3. Jesse Plamondon-Willard (Pathoschild)
    26 October 2007 17:35

    Hello,

    I object to its final approval; my analysis of the test project shows that there is only one active editor, which isn't enough to ensure the project's stability. If that single editor becomes inactive, the project will become another empty wiki that stewards and the small-wiki monitoring volunteers need to babysit.

    The request is located on the multilingual Wikisource; unlike the Incubator, this does not inhibit a small community in any way. They can link to Armenian titles exactly as they normally would (without a project prefix), and interlanguage links work. They benefit from the combined multilingual care of that wiki's administrators, so spam and vandalism are not serious problems.

    Once the community becomes a little bigger, I will be all for approval.

    <http://incubator.wikimedia.org/wiki/User:Pathoschild/Status/ws-hy#Analysis>

  4. Bèrto 'd Sèra
    26 October 2007 17:41

    Hi,

    I wasn't aware they were that small. Yours looks like a sensible position to me. I withdraw my approval.

    Berto 'd Sera

  5. Jesse Plamondon-Willard (Pathoschild)
    30 October 2007 00:26

    Hello,

    <http://meta.wikimedia.org/wiki/User_talk:Pathoschild#Armenian_Wikisource>

    The user would like to know what other members of the subcommittee think; comments welcome.

  6. Bèrto 'd Sèra
    30 October 2007 11:26

    Hi!

    On one thing he is right: a Wikisource cannot be expected to have the same amount and continuity of contributions as a Wikipedia. Yet they DO need an administrative community to be in place. When they say that one person was doing the localization and the other was adding material, they show that such a minimal community is taking shape. This IS a positive sign, yet it takes a bit more than that. Wikisourcing means instructing people to use scanners, among other things. It really doesn't need MANY active contributors, but it does need a good help-desk to run properly.

    The rest is a bit weaker and possibly just emotion-driven. We have not rejected the project; we keep it incubating, as we do with everyone else, until we are satisfied.

    The objections against a common Wikisource are equivalent to saying that "commons" should be split. While I understand that Wikisource may need a better way to serve proper "main page" content on an automated basis (for example, many browsers will send a request for "preferred content"), I quite reject the idea that the only way I can have a pms.wikisource is by opening such an independent project.

    Small wikis often have JUST the human horsepower they need to exist; breaking the community into 2 or more projects doesn't seem brilliant to me. If using commands in English is an issue, then waiting for MLMW may be the best answer. Both for Commons and Wikisource.

    Localization is going to be needed EITHER way, and it is quite a contradiction when many people refuse a single Wikisource because it is not localized and at the same time often end up leaving 70% of their messages in English. I'm not saying this is the case with the Armenians, yet in practice it happens quite frequently and it's a matter for thought.

  7. Bèrto 'd Sèra
    30 October 2007 11:39

    Hi,

    "So the application of the criteria (analysis of activity during the last 30 days) does look arbitrary in this case"

    This is another problem. The final criterion IS arbitrary, since we actually judge based on a number of issues, some of which are project-specific and others culture-specific. Yet when we use arbitrary criteria we should say so.

    Also, project-specific criteria should be stated clearly once and for all, since they are NOT arbitrary (cf. Wikisource and scanning help for the local extension, etc.).

    I guess this is a chance to involve the Armenians in the discussion and use them as a template to originate a written policy for Wikisource.

  8. Gerard Meijssen (GerardM)
    30 October 2007 11:56

    <this user has not agreed to public archival.>

  9. Jesse Plamondon-Willard (Pathoschild)
    31 October 2007 01:21

    Hello,

    GerardM's area of concern (ISO codes) has been addressed, but not mine (activity). Three editors is a strict minimum in my opinion; it's encouraging that there were three editors a few months ago, but that doesn't mean anything if they're gone now.

    A community is the single most important aspect of a wiki; localization only empowers the community, and a language code only identifies the language and filters down requests for us. Without a community, there is no need for a wiki. There are far too many wikis that were created without active communities in the past, and the stewards and small-wiki monitoring team are still dealing with the mess every day.

    I'll be perfectly willing to give my support once there is some consistent activity *today*.

  10. Shanel Kalicharan
    31 October 2007 03:13

    Even if we required a certain number of editors, that doesn't guarantee that they won't abandon the project later. Besides, I think it's unreasonable to expect every test project to be consistently active, especially if it's not a widely spoken language, or one where most of the speakers would not have a computer or internet access. *We're* not consistently active. :P Armenian is spoken by about 7 million people, according to Wikipedia, but I doubt Armenia is a particularly wired country. Their Wikipedia seems fairly quiet too (about 20-50 edits/day, by a small group of registered users and a few anons), but it's doing fine. A Wikisource is even less trouble, and I personally don't think it's a big deal if they only have one editor who is active at the moment.

    On a related note, Teak (the user who completed the Armenian localization) wants to know if he could be a "guest" on this list (we can cc him). He would like to know how this subcommittee works, and has some ideas on how to make the process of getting a project approved less stressful. Considering we've never, to my knowledge, gotten input from users who have been through the process, I think it would be valuable. He would only want to do so after we have dealt with the Armenian Wikisource, of course, but would this be ok with everyone else?

  11. Gerard Meijssen (GerardM)
    31 October 2007 04:55

    <this user has not agreed to public archival.>

  12. Bèrto 'd Sèra
    31 October 2007 07:49

    No probs

  13. Arria Belli (Maria Fanucchi)
    31 October 2007 12:44

    Why not? It couldn't hurt to try, I think.

  14. Gerard Meijssen (GerardM)
    03 November 2007 19:02

    <this user has not agreed to public archival.>

    ---------- Forwarded message ----------
    From: viKtor <email censored>
    Date: Nov 3, 2007 7:48 PM
    Subject: Fwd: On Armenian Wikisource request
    To: GerardM <email censored>

    Thanks for forwarding :)

    -Teak
    ---------- Forwarded message ----------
    From: teak <email censored>
    Date: Nov 1, 2007 1:41 PM
    Subject: On Armenian Wikisource request
    To: <email censored>
    Cc: teak <email censored>

    Hello all,

    After numerous discussion with Pathoschild and Shanel, and learning
    that there is currently a deadlock about the final approval of ws:hy,
    I decided to write to the list, and "present my case" (the possessive
    pronoun "my" should be understood in the broader sense of Armenian
    Wikisource's :). The latest discussions haven't been publicly archived
    yet, so I apologize if I happen to repeat anything that's already been
    said and discussed.

    So what is Armenian Wikisource so far:
    Armenian texts started to appear in oldwikisource in the fall of 2006,
    more than a year ago. This first contributor was Վազգեն։
    http://wikisource.org/wiki/User:%D5%8E%D5%A1%D5%A6%D5%A3%D5%A5%D5%B6,
    who started adding texts anonymously at first, but registered in
    November 2006, and to date has done over 780 edits in oldwikisource.
    http://wikisource.org/w/index.php?title=Special:Contributions&offset=20070401010248&limit=250&target=%D5%8E%D5%A1%D5%A6%D5%A3%D5%A5%D5%B6
    I myself joined the project on June 1st, and am the author of ~350 edits:
    http://wikisource.org/w/index.php?title=Special:Contributions&offset=20070613042952&limit=100&contribs=user&target=Teak
    A third active participant joined in mid July, and has so far done
    about 100 edits:
    http://wikisource.org/w/index.php?title=Special:Contributions&limit=100&contribs=user&target=%D5%8D%D5%A1%D5%B0%D5%A1%D5%AF
    The request for a separate domain was filed on June 8 by a 4th contributor
    http://meta.wikimedia.org/w/index.php?title=Requests_for_new_languages/Wikisource_Armenian&action=history
    who has done only a few anonymous edits on oldwikisource.

    All the 4 contributors have been with the Armenian Wikipedia for at
    least 1.5 years, and two of them - myself and VRuben (he filed the
    request) are sysops on wp:hy.

    The test project was very active in June-August, and we hit 400 texts
    by mid August (from the history of the request page you can see the
    summaries of my edits about an additional 100 texts, by date).
    In late July I communicated with Bèrto 'd Sèra on an unrelated issue
    (the logo on wp:hy), during which I also asked about an approximate
    timeframe of language request processing. His response is still on my
    wp:hy talk page:
    http://hy.wikipedia.org/wiki/%D5%84%D5%A1%D5%BD%D5%B6%D5%A1%D5%AF%D6%81%D5%AB_%D6%84%D5%B6%D5%B6%D5%A1%D6%80%D5%AF%D5%B8%D6%82%D5%B4:Teak#Langcom_timing

    I was away in the second half of August, and came back in the
    beginning of September to see that no action whatsoever was taken on
    the request, which by this time was almost 3 months old. This had the
    most discouraging effect on me and, I'd think, on every other
    contributor. During the period of mid August to mid-October, only
    about 25 new texts were added to the test project, and all by myself.
    After leaving a note on the talk page of langcom's page on meta on
    September 22:
    http://meta.wikimedia.org/wiki/Talk:Special_projects_subcommittees/Languages#Wikisource_Armenian
    I contacted several members of the langcom on their meta talk pages,
    who I figured were the most active. Finally the request was
    conditionally approved on October 7, at which time the only condition
    for final approval was the localization (keep in mind that at this
    time the test project hadn't had a preceding month of regular
    activity).
    http://meta.wikimedia.org/w/index.php?title=Requests_for_new_languages/Wikisource_Armenian&diff=700839&oldid=649463

    After the conditional approval note, I jumped into the localization,
    and promptly completed it around October 24th. On October 26th
    Pathoschild modified the request's page to include a link to the test
    project's statistics summary:
    http://meta.wikimedia.org/w/index.php?title=Requests_for_new_languages/Wikisource_Armenian&diff=724738&oldid=700845
    in which now an additional condition of "develop an active test
    project" was added:
    http://incubator.wikimedia.org/w/index.php?title=User:Pathoschild/Status/ws-hy&oldid=131773

    Today we have a test project that contains over 430 texts, 3 active
    contributors, one of whom was rather active in the last month (the
    very first contributor of Armenian texts to oldwikisource); the other -
    myself - was active during the last month localizing the interface;
    and the third active contributor to the project has been continually
    active on wp:hy, and is back editing in the test project on
    oldwikisource. So this brings us to the requirement of a month's worth
    of regular activity, which Pathoschild has brought up in our IRC
    communications numerous times...

    Before proceeding to explain my point of view as to why I think the
    request should be approved now, with no further conditions, I want to
    let everyone know that I myself believe that an active (successful)
    test project is the foremost indicator of future prosperity of the
    actual project. And coming up with the required activity numbers for a
    period of one month is not an insurmountable difficulty for this
    project, and even quite the opposite.
    So why shouldn't an additional month of regular activity be required
    before the ws:hy can be approved:

    1. Firstly, because it is plainly unfair to the project and its contributors! Had we been told at the time of conditional approval that activity was an issue, I would have concurrently worked on both the localization and delivering activity statistics (by editing myself, and getting more users to contribute), and would not have ended up in a situation where we now need to show an additional month of activity.

    2. Why is activity required in the first place? My understanding
    always was that the activity should show that a) the requesting group
    is familiar with how Mediawiki's wikis work in general (templates,
    categories, NPOV, copivio, etc); b) the language used in the test
    project is the one the request is made for; c) the contributors can
    work together, and not end up in an atmosphere of distrust and
    continuous edit wars, that in the end will spill into others'
    headaches; d) things can be done in general.
    All of this was demonstrated by the ws:hy test project. We showed that
    things can be done, and done smoothly, by calmly accumulating 400+
    texts. All of the contributors are long time wikipedians, so the
    technical points are not an issue in this case, and the language...
    well, you can check on that. ;)

    3. Shanel asked me on IRC what would happen to the project if I
    get hit by a train, so I assume this question also would have come up
    with others, since Pathoschild's original message with objection led
    people to believe that this was a one-man show. Well I'm not the one
    active contributor in Pathoschild's statistics, and the recent changes
    in oldwikisource show that there are 3 contributors in Armenian within
    the last week, all of whom have been with the test project for months:
    http://wikisource.org/wiki/Special:Recentchanges

    4. The previous question also chimes with Pathoschild's opinion that
    inactive projects become a burden for the Small Wiki Monitoring Team.
    Well, our (not mine alone!) continuous contributions to the well being
    of ws:hy during the 5 month period of development should be enough
    indication that this project is not one to end up abandoned. And in my
    opinion this is a slippery-slope question in itself, since denying a
    project its existence because of potential future misdeeds is
    contrary to presumption of innocence, which is an expected norm in a
    democratic society (unless this is a "track record" check, which we
    have demonstrated over time, and making the bureaucratic requirement
    that only the activity in the last month should count is... well,
    bureaucratic). The issue of abandonment should be dealt with after the fact, not before, and, I believe, via an expanded set of rules compared to what is currently done. Say, locking the db if SWMT reports an absence of
    any positive activity within the last month, which can then be
    unlocked once an interested party requests it. Right now the closure
    procedures are clumsy and very inefficient, hence the fear, which adds
    unnecessary complications to langcom's procedures.

    5. I've always refrained from promoting the test project too much,
    since I didn't want to confuse potential contributors by the switch to
    a different domain later. I'm of the opinion that most of the
    contributors shouldn't have the need to know all the bureaucracy and
    the technical bits of how a wiki operates (this is one of the reasons
    I'm the only one of the test project knocking on doors). And my
    thoughts (as well as those of others with whom I communicated about the
    project) were to do what it takes to deliver a project in its final
    standing (its own domain, policies on copyright - one of the reasons
    why Armenia's law on copyright was one of my first contributions to
    the project, etc.), and promote the hell out of it, so that regular
    users would come and have a joyful time contributing and seeing the
    project grow. And this is another reason why I don't want to invite
    too many (potentially inexperienced) users at this stage, as
    Pathoschild suggested. We've had hell in wp:hy because of the mess
    left by the early contributors of the project (which we still try to
    clean up), and the last thing I wanted was to have the same thing
    repeated on ws:hy.


    So there you have it. With all this, I ask you to consider approving
    the request for a separate domain for Armenian Wikisource. Please let
    me know if you would need more info on anything, and please do
    consider cc'ing me your responses. Thanks

  15. Jesse Plamondon-Willard (Pathoschild)
    03 November 2007 19:24

    Hello,

    I'll abstain from discussing this request further; I have already explained what I think should be done before it is approved.

    As for Teak's message, once my analysis script has access to the toolserver database, it will be able to provide much more thorough statistics (such as monthly activity and possibly graphs of editing trends); the one-month limit is a purely technical one (Special:Recentchangeslinked is limited to one month).

    One thing that I think is important is to track the progress of projects *after* we approve them. Which ones became inactive later, and why? Did a request we had low hopes for prosper, and how can we adjust the policy to facilitate the process for such cases? So far the policy is entirely based on what happens before approval, and we're just extrapolating what encourages a wiki to do well.

  16. Gerard Meijssen (GerardM)
    03 November 2007 21:21

    <this user has not agreed to public archival.>

  17. Bèrto 'd Sèra
    03 November 2007 21:59

    Jesse, it's all nice, but aren't we biting off more than we can chew? We barely have the capacity needed to analyze the formal aspects of a request and check the numbers once projects apply for final release; I do wonder how you can track projects after they are released.

    Personally, I don't think we should have ANY hopes, much less check them. We give a preliminary yes/no based on formal requirements, check the quantities once they say they feel they have done enough, and that's it.

    Each culture has its own situation and road. There is NO such thing as a general recipe for happiness. What makes you and me feel great may kill somebody else, and we have no way whatsoever to know what these people are confronting in terms of marketing.

    Basically, all we can tell them is what everybody already knows: "get as much press as you can" and "avoid turning your project into a dictatorship". But HOW can you check this? All you have OBJECTIVE access to is results.

    If we want to set some minimal quantitative indicators per project, then okay, let's do it. But be aware that people can decide to use the "preview" button much less frequently and produce much better numbers by saving whatever they do. They can make anonymous edits from a floating IP... lots of stuff. How will you check?

  18. Jesse Plamondon-Willard (Pathoschild)
    04 November 2007 00:55

    GerardM,

    Statistics help us make decisions. You are satisfied with their past effort; I'm concerned with the present lack of activity. This has nothing to do with strict adherence to objective criteria, but is based on my experience with dead wikis.

    As I've said, I'm not going to oppose this wiki; I intend to track its progress after its creation to determine how the criteria in the policy can be changed to better reflect real needs.


    Berto,

    You say "But HOW can you check this? All you have an OBJECTIVE access to is results", but that is exactly what I intend to do; measure growth and feasibility by tracking the results and generating comprehensive statistics. This is actually easier to do after we create the wiki, because it's not necessary to distinguish changes to other projects.

    The statistics are general tools; they show trends and help us make decisions. Is a project growing or declining, and why? How many new pages were created, and how much information is added per edit (daily, monthly, yearly)? How much raw text was added or removed in the last month? How many registered editors make how many edits how often to how many pages? How many editors are anonymous from what address ranges? These and other statistics are all indicators of activity.

    We're not biting off more than we can chew. This is one of the subcommittee's primary goals: to ensure, through an objective policy, that approved projects prosper. This is stated right in our founding charter:
    "The development and maintenance of ... a clear step-by-step policy (based on quantitative indicators) for evaluating the feasibility of new language wikis, with an automated procedure for project development".

    We can only develop an objective policy if we know *why wikis prosper*. It's pointless to add a new criterion if it doesn't help them prosper; is three editors a good minimum community for a Wikipedia? What about for a Wikisource? Should it be higher, or lower? Should we take other factors into account, or remove criteria that don't affect feasibility?

    Right now, many of our decisions are based on simple gut feelings, and they are frequently biased and preferential. We tend to approve projects promoted by those we know or talk to (Cary or Teak, for example), while ignoring those that productively follow the procedure.

    Anyway, this discussion is largely academic. The statistics are entirely unofficial, and will be done with or without subcommittee approval. They do not affect a project's approval, except in so far as they measure conformity to the policy and inform our judgment.

  19. Bèrto 'd Sèra
    04 November 2007 01:24

    Hi,

    The problem is how credible statistics are in general. How will you tell a botpedia using a floating IP from a real pedia made of (say) 2-3 regular users?

    Most of the projects we are launching cannot hope to get any better than that, IF they are lucky. When we started PMS.wiki there were two of us. I never cared much for content; basically my role is localization plus some work on philology and the history of the language, whenever I have time for that (i.e., almost never). After a year and a half there are 5 admins, who are also the 5 most frequent users, plus 3-4 more who come and go. One of them is also the guy who writes the best content we have to offer. He produces very little quantity and a LOT of quality. How will you measure this?

    But let's stick to brute quantity. The only reason we got to 10K articles is that we got LOTS of press exposure. Those who did not get that much attention lie far below us. HOW will you measure their ability to connect with the press? By counting the number of edits? What do those tell you? Will you use the number of one-line articles? The number of non-article pages? We never used the village pump, because MediaWiki is absolutely useless as a forum. Instead, we mail each other, much as we do here, for the very same reason.

    I mean, numbers are okay, but which numbers are you going to use, and what meaning are you going to give them? Based on what? I haven't seen a single stat about wikis that got any further than counting the titles and the edits, plus the recurring users. The only REALLY relevant quantitative data (incoming read traffic) is something we completely lack.

    How successful is a project that writes a lot of crap that nobody reads? We'll never know, because the only data people care about is the top-10 position of en.wiki. Let's face it: we have NO way to determine whether a project is successful or not, especially when dealing with small linguistic entities.

  20. Gerard Meijssen (GerardM)
    04 November 2007 07:24

    <this user has not agreed to public archival.>

  21. Jesse Plamondon-Willard (Pathoschild)
    04 November 2007 09:05

    Berto,

    Statistics are a tool, not a magic bullet. Statistics can't say "this project is successful"; community size can't say that, article count can't say that, press coverage can't say that. There's no automated way we can look at a project and say, "this project will succeed".

    What we *can* do is look at all the factors, and say "this project has a good chance of succeeding based on our previous experience". Based on my experience, a project with three active editors has a good chance of succeeding, because those editors will probably stick around and continue editing (and bot editors are easy to recognize and discount). Three is just enough for a balanced community.

    A project with fewer than two active editors has little chance of succeeding. You mention pms-Wikipedia as a success story, but see <http://meta.wikimedia.org/wiki/Inactive_wikis> for the many more failures. What they and many other failed wikis have in common is that they were started with an insufficient community. That is the deciding factor for how well a wiki does, not press coverage or localization or having a committee approve it based on whether another committee created a code to refer to the language.

    Just one example is ak.wikipedia (the third on the list); it has 34 articles written by a community now unknown. Over the last month, there have been zero article edits, but several vandal edits. The recent changes list is entirely filled with vandalism and spam, steward interventions, and unflagged interwiki link bot edits.

    Creating any wiki with one or two active editors and hoping it will work out some day simply doesn't work. Many of those wikis were created years ago; many of the inactive wikis listed on <http://meta.wikimedia.org/wiki/Wiki_activity_statistics> in 2004 are still inactive today, nearing 2008.

    What you can see on the status pages now are only the very first primitive statistics; within a few months, they will be much more comprehensive and informative, and I'll track both test projects and newly approved wikis. Over time I hope to develop some objective criteria based on these observations to improve the policy, so that where possible we have a few more objective criteria and a few fewer that are based on how good we feel that day. If you don't think that's possible, think of it this way: it's my free time to waste as I see fit.

    If you don't think the statistics are relevant, that's okay. You're free to use your own judgment based on your own experience. I think the statistics are important to inform my opinion, based on my experience, of the project's feasibility.

    Thus, I object to the wiki's creation based on my experience. I'm not pushing my statistics on anyone, any more than GerardM is pushing his gut feelings on me because he approves based on them.

    If we can't reach a consensus, let's just put it to a vote, probably approve the wiki, and I will track its progress. Maybe in a month or two I'll say "I told you so", or maybe I'll have a different opinion when a similar request comes up.

  22. Bèrto 'd Sèra
    04 November 2007 13:20

    Hi!

    I'm changing the thread name, since this is a general issue, not really a strictly Armenian and/or Wikisource issue. I'm also avoiding the word "successful". I do it on purpose.

    I'm aware of the spam problem. I clear spam links to tourist sites out of pms.wiki every day, and I agree that this is a major waste of time. In this sense I can understand your worries.

    I'm asking myself whether we shouldn't route small projects to a general shared container, a sort of "incubator phase 2", in which they could grow while already being public AND sharing the patrolling functions.

    Common Wikisource IS such a structure; once we get MLMW there may be an equivalent common wiki, too.

  23. Gerard Meijssen (GerardM)
    04 November 2007 14:42

    <this user has not agreed to public archival.>

  24. Jesse Plamondon-Willard (Pathoschild)
    04 November 2007 23:50

    Hello,

    We're obviously not going to reach a consensus, so I'm going to put this to a straight vote. I'll carry out the majority decision in 72 hours. So far (feel free to change):

    • supporting are Berto and Shanel;
    • no opposing;
    • abstaining are GerardM and myself.
  25. Gerard Meijssen (GerardM)
    04 November 2007 07:24

    <this user has not agreed to public archival.>

  26. Jesse Plamondon-Willard (Pathoschild)
    CC Erik Möller (Executive Secretary)
    12 November 2007 04:41

    Hello Erik,

    The language subcommittee recommends the creation of an Armenian Wikisource. Further information can be found by following the links below. The request will be approved and the wiki created if the board does not object within four days, as previously agreed and described in our charter.

    More information:

  27. Gerard Meijssen (GerardM)
    04 November 2007 07:24

    <this user has not agreed to public archival.>

  28. Jesse Plamondon-Willard (Pathoschild)
    17 November 2007 23:13

    Hello,

    Shanel has approved the request, and I filed the report. See <http://meta.wikimedia.org/wiki/Special_projects_subcommittees/Languages/Schedule> for a list of approved wikis awaiting creation.

Wikipedia Latgalian[edit]

The second request for a Latgalian Wikipedia was rejected.

  1. Jesse Plamondon-Willard (Pathoschild)
    26 October 2007 19:35

    Hello,

    I propose we reject the request for a Latgalian Wikipedia, as Latgalian does not have an ISO code. They can make a new request when the language becomes suitable.

    Ideally, we should link them to (or create) a page with summarized information on what an ISO code is, what the requirements are, and how to apply for a code.

  2. Jesse Plamondon-Willard (Pathoschild)
    26 October 2007 19:37

    PS: The request page is at <http://meta.wikimedia.org/wiki/Requests_for_new_languages/Wikipedia_Latgalian_2>.

  3. Jesse Plamondon-Willard (Pathoschild)
    26 October 2007 21:36

    I'll implement the decision in 48 hours if nobody objects, as usual, so please say so if you need more time to formulate your objections.

  4. Shanel Kalicharan
    26 October 2007 23:30

    If the only problem is lack of an ISO code, can we do something other than outright reject them? I can't imagine rejection would be good for their enthusiasm about the project.

  5. Bèrto 'd Sèra
    27 October 2007 08:19

    No, there's absolutely no way to do anything apart from "no". Still, individual members may want to contact them and help them in their dealings with ISO. I still think we would never have fallen into such a situation if we had accepted requests through a ticketing system.

  6. Jesse Plamondon-Willard (Pathoschild)
    27 October 2007 23:23

    Shanel,

    Most languages requested without an ISO code lack one simply because they were never considered for a code, so getting one assigned may be as simple as submitting a request with appropriate evidence. In this case, however, the code request apparently *was* considered, and it was rejected on the grounds that Latgalian is part of Latvian.

    I don't mean to be cynical, but we shouldn't encourage their enthusiasm for a project we won't allow anyway. I think rejection is the only good result in this case; the request can be reopened if the code situation changes in the future.

  7. Shanel Kalicharan
    28 October 2007 00:11

    You're always cynical, so there. I wish there were something we could do, but if not, then I guess the best thing is to reject. I still feel really bad for them. :(

  8. Jesse Plamondon-Willard (Pathoschild)
    28 October 2007 05:55

    Hello,

    I've rejected the request.

  9. Gerard Meijssen (GerardM)
    28 October 2007 06:00

    <this user has not agreed to public archival.>

Wikisource Min Dong[edit]

No decision was taken regarding the request for a Min Dong Wikisource.

  1. Jesse Plamondon-Willard (Pathoschild)
    27 October 2007 23:18

    Hello,

    I propose the rejection of the Min Dong Wikisource. There is currently no active community, the test project is inactive and has only one article, and none of the interface is localized. The community has been inactive for nearly a year.

    Although it has an ISO code, we should not conditionally approve requests pre-emptively without an editing community, because our criteria can change. For example, the Ottoman Turkish request could have been conditionally approved months ago because it has an ISO code, but that approval would now be revoked under the new rule against historical languages.

    A rejection with a comment explaining why it was rejected and inviting users to form a new community and resubmit the request is the best course of action.

  2. Shanel Kalicharan
    28 October 2007 00:12

    No interested community at the moment, so I agree with rejecting this request.

  3. Bèrto 'd Sèra
    28 October 2007 00:16

    We never put restrictions on time. What's the added burden in keeping it open?