Requests for queries/Archives

From Meta, a Wikimedia project coordination wiki


To do

Query Log

I'm doing research on DB schema evolution and I'm using Wikipedia as a test case. I'm interested in the query log of one of the Wikimedia wikis, say en. Any slice of this log up to 10 GB would be interesting.

If this is not the right place to ask please redirect me.

thanks Krlis1337 18:07, 5 September 2007 (UTC)

Data dumps#What happened to the SQL dumps?, current data dump. Looks like they're not around anymore, if I'm reading this right. ~Kylu (u|t) 00:16, 7 September 2007 (UTC)


I guess I was not clear... I'm not referring to the dumps of the DB. I actually need the log of the queries posed to the DB, i.e., the list of queries the DB needs to serve during normal operation (typically the query_log of the MySQL DBMS). Is this available?

thanks Krlis1337 10:25, 16 October 2007 (UTC)

Track Andres.miranda1 (talk) 19:14, 24 September 2023 (UTC)

Update this for new formats on download site

The formats of the SQL files have changed, and the documentation above for offline querying is out of date; it needs to be updated to use the newer file names.

Email address on kw.wiktionary

Can someone set my email address on kw.wiktionary for kw:wikt:User:SunStar Net to sunstarnet.wikipedia@googlemail.com so I can get a new password? --WiganRunnerEu 08:00, 8 May 2007 (UTC)

That's not a database query request. Cbrown1023 talk 20:31, 6 October 2007 (UTC)

Interwiki link counts

Can someone get me the output of the following query for every Wikipedia against every other Wikipedia? It's an urgent request, please. -- Sundar 09:02, 4 August 2006 (UTC)

Whoever runs the query, I'll be grateful if you can also drop a message on my talk page or email me. -- Sundar 09:06, 4 August 2006 (UTC)

Example query:

select * from cur 
where cur_namespace=0

Namespace title translation

There are still many articles in the Russian Wikipedia where English namespace names are used in links (e.g. "Category" instead of "Категория"). It would be easiest to correct this with SQL queries. I guess the queries would be as follows:

UPDATE cur
SET cur.cur_text = replace(cur.cur_text, '[[Category:','[[Категория:')

and also:

UPDATE cur
SET cur.cur_text = replace(cur.cur_text, '[[:Category:','[[:Категория:');
UPDATE cur
SET cur.cur_text = replace(cur.cur_text, '[[Image:','[[Изображение:');
UPDATE cur
SET cur.cur_text = replace(cur.cur_text, '[[Template:','[[Шаблон:');
UPDATE cur
SET cur.cur_text = replace(cur.cur_text, '[[User:','[[Участник:');

Thanks in advance. --CodeMonk (admin of ru.wikipedia.org) 22:40, 17 November 2005 (UTC)

Stubs mentioning Croatia

Could you please run this query on en: wikipedia:

SELECT cur_title
FROM cur
WHERE cur_namespace=0
AND cur_text LIKE '%stub}}%'
AND cur_text LIKE '%Croatia%'
AND cur_text NOT LIKE '%{{Croatia-stub}}%'
AND cur_text NOT LIKE '%{{Croatia-bio-stub}}%'
AND cur_text NOT LIKE '%{{Croatia-geo-stub}}%';

This should return the names of stubs which mention Croatia but are not categorised into any of the Croatia-related stub categories. You can put the result on en:User:Dijxtra/SQL. It would be very helpful for the Croatian stub sorting effort. Thanks. -- Dijxtra, 20:25, 16 September 2005 (UTC)

In the process of doing this. --AllyUnion 04:30, 29 December 2005 (UTC)
The query is updated to conform to new database schema and expanded on other countries from former Yugoslavia:
SELECT page_title
FROM page, revision, text
WHERE page.page_id=revision.rev_page
AND revision.rev_text_id=text.old_id
AND page.page_namespace=0
AND page.page_is_redirect=0
AND text.old_text LIKE '%stub}}%'
AND text.old_text LIKE '%Bosnia%'
AND text.old_text LIKE '%Croat%'
AND text.old_text LIKE '%Macedonia%'
AND text.old_text LIKE '%Montenegr%'
AND text.old_text LIKE '%Serb%'
AND text.old_text LIKE '%Sloven%'
AND text.old_text LIKE '%Yugoslav%'
AND text.old_text NOT LIKE '%{{BiH-%'
AND text.old_text NOT LIKE '%{{BosniaHerzegovina-%'
AND text.old_text NOT LIKE '%{{Croatia-%'
AND text.old_text NOT LIKE '%{{Serbia%'
AND text.old_text NOT LIKE '%{{Slovenia-%';
Note that there's no dash in '%{{Serbia%' on purpose. --Dijxtra 22:47, 17 January 2006 (UTC)

Passwords/email of Guanaco's accounts

Please set the password of all accounts on all wikis named "Guanaco" or "Guanabot" to my password on enwiki or meta (whichever is convenient). Also set the email address to "guanaco@cox.net" on all these accounts. Thanks, Guanaco 05:16, 3 September 2005 (UTC)

The password part is not possible to carry out in a reasonable timeframe due to the way MediaWiki password hashing works.
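As background (a sketch, assuming the password scheme MediaWiki used in that era): the stored value was roughly md5("&lt;user_id&gt;-" + md5(password)), i.e. the hash is salted with the user ID, so the same password hashes differently on every account and a stored hash cannot simply be copied between accounts.

```python
import hashlib

def legacy_mw_hash(user_id: int, password: str) -> str:
    # Roughly the historical MediaWiki scheme (illustrative only):
    # md5("<user_id>-" + md5(password)).
    inner = hashlib.md5(password.encode("utf-8")).hexdigest()
    return hashlib.md5(f"{user_id}-{inner}".encode("utf-8")).hexdigest()

# The same password yields different stored hashes for different user IDs.
print(legacy_mw_hash(42, "hunter2") == legacy_mw_hash(99, "hunter2"))  # False
```

Setting the hash would therefore require knowing the plaintext password, not just copying the hash stored for the enwiki account.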

English-language Wikipedia; list of non-surname alphabeticised British MPs

I would like a list of all British MPs (categorised into Category:British MPs), who are not entered alphabetically by surname. I believe the correct query is this:

 SELECT c.cur_title FROM wikipedia.cur c
 WHERE c.cur_text LIKE "%[[Category:British MPs]]%";

Please put the results at en:User:Talrias/British MPs. Thank you, Talrias 22:30, 3 Jul 2005 (UTC)

Count articles in en:Category:People stubs (completed)

SELECT LEFT(page_title, 1), COUNT(LEFT(page_title, 1))
FROM categorylinks, page
WHERE page_id = cl_from
AND cl_to = 'People_stubs'
GROUP BY LEFT(page_title, 1)
ORDER BY LEFT(page_title, 1);

The results of this query can be placed at en:User:Joy/somewhere and then I'll integrate them into the nice table at en:Category:People stubs. --Joy 01:35, 2 Jul 2005 (UTC)

It can also include something like "FORCE INDEX (id_title_ns_red)", but I'm not sure what exactly that does; another user supplied it. I've initially been working off a simpler select that dumped all the article names. --Joy 01:35, 2 Jul 2005 (UTC)

Done by Brion (completed) --AllyUnion 04:29, 29 December 2005 (UTC)

Fix number of total articles in Portuguese Wikibooks

Please fix the total article count on the Portuguese Wikibooks. Many bad articles have been deleted in recent months, but the number has not gone down.

Fix sort keys for subcategories in Danish Wikipedia

The Danish Wikipedia contains a number of categories where the listed subcategories are not sorted correctly (this is not the known problem of the collation sequence not being localized). It is caused by the sort key being prefixed with the category namespace name ("Kategori") in many cases (to be more precise: 720 cases in the latest dump of the categorylinks table). The sorting problem is very clear at, for example, da:Kategori:Danskere and da:Kategori:Europæere. I believe it can be fixed by executing this SQL query:

 UPDATE categorylinks
 SET cl_sortkey = SUBSTRING(cl_sortkey,10)
 WHERE cl_sortkey LIKE 'Kategori:%';

Thanks. Byrial 17:13, 22 Nov 2004 (UTC)
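For clarity, a Python sketch of what that UPDATE does (the helper name is invented): MySQL's SUBSTRING is 1-indexed, so SUBSTRING(cl_sortkey, 10) keeps everything after the 9-character prefix "Kategori:".

```python
def fix_sortkey(sortkey: str) -> str:
    # Mirrors SUBSTRING(cl_sortkey, 10) applied only to rows matching
    # LIKE 'Kategori:%' -- drop the 9-character namespace prefix.
    prefix = "Kategori:"
    if sortkey.startswith(prefix):
        return sortkey[len(prefix):]
    return sortkey

print(fix_sortkey("Kategori:Danskere"))  # Danskere
```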

Update webalizer statistics for English Wikipedia

I would like to have the full webalizer statistics of September 2004 and October 2004 for the English wikipedia, so I can update the relevant statistics pages. Can someone with access to the logs generate these reports? Thanks in advance. Sietse 15:06, 17 Nov 2004 (UTC)

Not really. The reason they aren't being done is the load kills the whole site. They'll probably be back once a better way to do it is found. —Kate | Talk 02:55, 2004 Nov 18 (UTC)
Webalizer actually just generates static HTML pages and images (for the bar charts). Surely this could be done easily by copying the logs off the server onto some other box and just running webalizer there? Then copy the resulting HTML and images back onto the server. josh_AT_mirtec_DOT_net
Copying the logs off the live server (remember we generate 2GB of logs per day) is one of the problems in the first place; we also don't have servers lying around to run random things on ;) However, it will be back eventually. Probably. —Kate | Talk 15:09, 2004 Dec 29 (UTC)
I would be willing to run webalizer on these logs if someone were to make them available to me. I have a machine fast enough to process this information, although it will probably not be done overnight. If someone can direct me to a download link for the server logs, I can get started on running webalizer for you. If you have questions, please respond on my enwiki talk page at this link. Phuzion 06:15, 19 October 2007 (UTC)

Refresh the special pages on Chinese wiki (zh)

Please refresh the special pages that are served from cache. From what I can tell, from oldest to newest the special pages are:

  1. Special:Deadendpages
  2. Special:Lonelypages
  3. Special:Uncategorizedpages
  4. Special:Wantedpages

Please let me know if these should be handled with offline reports like those on the English wiki, or if there is something that I can do as a sysop on the Chinese wiki. --Vina 04:22, 6 Oct 2004 (UTC)

Refresh the special pages on Catalan wiki (ca)

Please, is it possible to update these pages automatically, at least once a week? Our users are frustrated about these pages... Joanjoc 23:02, 11 Dec 2004 (UTC)

Special pages are refreshed automatically. If there is a problem, try a w:hard refresh. Thunderhead 20:37, 27 October 2007 (UTC)

Untagged images

Please give me a list of all untagged images that I and dpbsmith have uploaded to en. →Raul654 02:23, Sep 25, 2004 (UTC)


Untagged images

Please can I have a list of all images not tagged on the English and French Wikipedias. According to en:Wikipedia:SQL query requests, the following query should do it. Thank you. Angela 21:03, 28 Jul 2004 (UTC)

On second thoughts, maybe that's not the right query. I guess it should include all the tags mentioned at en:Wikipedia:Image copyright tags, and I don't yet know what the tags are in French. Angela 03:04, 29 Jul 2004 (UTC)
Perhaps every image not containing '{{' in the description would be sufficiently close? Jamesday 13:42, 29 Jul 2004 (UTC)
select i.img_name from image i
 where not exists (
       select 1 from cur, links where cur_title in
         ('GFDL', 'Fairuse', 'Copyrighted', 'Unverified', 'Unknown', 'Verifieduse',
          'Noncommercial', 'PD') and cur_namespace = 8 and l_to = cur_id 
           and l_from = concat('Image:', img_name)
       )
changed to avoid the *

Watchlists

  • Articles on 0 watchlists
  • Articles on fewer than 2 watchlists +sj+
    this should also catch articles only ever edited once.
    I created a list of the oldest "new" articles at de. Maybe the query (which is language independent) is of interest for someone. --SirJective 12:52, 6 Nov 2004 (UTC)

How about this (expressed as valid MySQL; the original sketch put an aggregate condition in WHERE):

CREATE TABLE wl_sj_tmp AS
SELECT wl_title, COUNT(*) AS multiplicity
FROM watchlist
WHERE wl_namespace = 0
GROUP BY wl_title
HAVING COUNT(*) > 1;

SELECT cur_title
FROM cur
WHERE cur_namespace = 0
     AND cur_title NOT IN (SELECT wl_title FROM wl_sj_tmp)

pages on no-one's watchlist:

SELECT cur_namespace, cur_title 
FROM cur 
LEFT OUTER JOIN watchlist 
 ON  cur_title = wl_title 
 AND cur_namespace = wl_namespace 
WHERE wl_title IS NULL;

—Kate | Talk 11:51, 2004 Nov 9 (UTC)

Wanted Pages - Only wanted once

Hi! The guys at [1] (a GPL CMS) would like their special page Wanted Pages to also return pages which are only linked to once. It seems there is a limit of two or more links. A way to find linked/unlinked empty pages would be fine too, and an SQL query would also be OK, I guess. I tried finding a way; the page which looked most likely to solve the problem is this one. thx --Schizoschaf 16:42, 20 Sep 2004 (UTC)

Fix my username

If you could fix the capitalization of my username on both Meta and EN wikis, that would be great. Or if this isn't the location to request this, a point in the right direction would be appreciated.

UPDATE user SET user_name='akghetto' WHERE user_name='AKGhetto';

Thanks a lot! akghetto 06:59, 7 Dec 2004 (UTC)

Sorry, usernames must start with a capital letter. This is a software restriction. —Kate | Talk 15:11, 2004 Dec 29 (UTC)
What happens if you manually update the DB like this? Will it "break" the wiki? Just curious. 66.230.89.205 08:15, 4 Feb 2005 (UTC)

Conflict investigation

Low priority, on en:, please copy to en:User:KeithTyler/209-132-Conflict:

 select old_title as title, old_comment as comment, old_timestamp as timestamp from old
 where substr(old_user_text,1,7)='209.132'
 union
 select cur_title as title, cur_comment as comment, cur_timestamp as timestamp from cur
 where substr(cur_user_text,1,7)='209.132'

Basically I am looking for the edit history from anon IPs in a given IP block.

If the union would be too much strain, or impossible (I'm afraid I may be thinking in Oracle at the moment), the query on old is the most important to me.

Thanks, en:User:KeithTyler 19:45, 9 Dec 2004 (UTC)

Fix Wiki Indices

I have the following problem: The "random page" index seems broken.

Query:

SELECT cur_id,cur_title FROM cur USE INDEX (cur_random) WHERE cur_namespace=0 AND cur_is_redirect=0 AND  cur_random>0.619759199498 ORDER BY cur_random LIMIT 1

Error:

1072: Key column 'cur_random' doesn't exist in table

How do I recreate this index? No, I have no idea what happened; it never worked after the installation...

-- NilsJeppe 11:29, 27 Dec 2004 (UTC)

Average length of stub articles

I'd like to know the average length of stub articles on En.

select avg(char_length(cur_text)) from cur where cur_text like "%{{stub}}%"

Also, a list of stubs that are longer than double the average stub length. (copy to en:User:Plop/Long stubs)

select cur_title from cur where cur_text like "%{{stub}}%" and char_length(cur_text) > (select avg(char_length(cur_text))*2 from cur where cur_text like "%{{stub}}%")

Thanks. 218.191.64.102 15:59, 9 Jan 2005 (UTC)

List of all articles created by user foo

I want a list of all articles created by a certain user (I have my own copy of the database). What I've constructed so far is (I know very little about SQL ;)):

SELECT old_title FROM old WHERE old_user_text="Ævar Arnfjörð Bjarmason" and old_namespace=0;

This is obviously lacking: first, it finds all articles to which I've made edits, and second, it does not list articles which have only a single edit (only in cur). — Ævar Arnfjörð Bjarmason 04:14, 20 Feb 2005 (UTC)

Is this correct?
select old_namespace, old_title, old_user_text, old_timestamp, old_id from old where old_namespace=0 group by old_title having min(old_id)=old_id and old_user_text='Gerritholl';
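One way to read the "created by" question: the creator of a page is the author of its earliest (lowest old_id) revision, which is what the HAVING min(old_id)=old_id trick above aims at. A minimal Python sketch of that logic, with invented data:

```python
def pages_created_by(revisions, user):
    """revisions: iterable of (title, rev_id, author) tuples."""
    earliest = {}  # title -> (lowest rev_id seen, its author)
    for title, rev_id, author in revisions:
        if title not in earliest or rev_id < earliest[title][0]:
            earliest[title] = (rev_id, author)
    return sorted(t for t, (_, a) in earliest.items() if a == user)

revs = [
    ("Foo", 1, "Gerritholl"),
    ("Foo", 5, "SomeoneElse"),   # later edit, not a creation
    ("Bar", 2, "SomeoneElse"),
    ("Bar", 7, "Gerritholl"),    # edited but not created by Gerritholl
]
print(pages_created_by(revs, "Gerritholl"))  # ['Foo']
```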

Number of discussion pages a user contributed to

I would like to create a top 10 of the most discussion-active users. The script would do something like this:

go through all article discussion pages
for every user found on a discussion page, increment their discussion_participation_var by one
ignore any further contributions a user has made on the same discussion page (thus, only one contribution counts per discussion page)

Unfortunately I wasn't able to find a way to do so using the MySQL dump I downloaded (there is no such information as "discussion pages"...). Actually, is there a more convenient way than going through all articles? Maybe every user already has his own index of discussion pages he contributed to? thanks in advance, --Abdull 13:40, 24 Mar 2005 (UTC)
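The steps above can be sketched in Python (sample data invented; each (talk page, user) pair counts at most once):

```python
from collections import Counter

def discussion_top10(edits):
    """edits: iterable of (talk_page, user) pairs, one per edit."""
    seen = set(edits)  # dedupe: one contribution per user per talk page
    counts = Counter(user for _page, user in seen)
    return counts.most_common(10)

edits = [
    ("Talk:Foo", "Abdull"), ("Talk:Foo", "Abdull"),  # same page: counts once
    ("Talk:Bar", "Abdull"), ("Talk:Bar", "SirJective"),
]
print(discussion_top10(edits))  # [('Abdull', 2), ('SirJective', 1)]
```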

There is a rather complicated way to do this using SQL, but it requires write queries. A better way is to parse the talk pages with another program. I have written a program for a similar task (searching for malformed signatures); maybe I'll find the time to modify it to your needs. The "discussion pages" are the pages in namespace 1 (article talk), 3 (user talk), 5 (WP talk) etc. --User:SirJective 11:48, 20 December 2005 (UTC)

Double Redirects in zh wp

I would like a query for double redirects in the article namespace and its talk namespace on zh wp. Thanks --Shizhao 18:49, 19 May 2005 (UTC)

Untagged images on sv. Wikipedia

I'd very much like a list of all untagged images on the Swedish Wikipedia. I and a couple of others are working on tagging all of these, and a list would make the work much easier. Thank you in advance! /Grillo 16:58, 3 Jul 2005 (UTC) (sv:Användare:Grillo)

I hope I did it right, using wikisign. See sv:Användardiskussion:Grillo#untagged images. --SirJective 11:48, 20 December 2005 (UTC)

Un-Interwikied articles/ctgr on :sl WP

I would like to request a query which would make a list of all un-interwikied articles and categories in the Slovene WP (sl:Glavna stran). TIA! If you could, put the results in w:sl:Uporabnik:Klemen Kocjancic/SQL. Regards, --Klemen Kocjancic 08:42, 28 November 2005 (UTC)

Did it, using wikisign. See your linked page. --SirJective 11:48, 20 December 2005 (UTC)

Undo all edits made by a specific person

To make it easier to delete all pages created and edits made by a spammer, it would be very helpful to have something at hand that lets you enter a username and undo everything they did, except perhaps pages that were edited by another user afterwards (those would show as conflicting pages). Whether it is a console or web interface wouldn't matter, but it would certainly be nice as a well-integrated special page. I'll start with something like this now, because we need it for the faiwiki.

I don't believe you've really thought about the consequences of a feature like that. en:User:Gazpacho 02:19, 7 March 2006 (UTC)

Number of watchers on de: WP

I would like to update the pages de:Benutzer:SirJective/Meistbeobachtete Artikel and de:Benutzer:SirJective/Beobachter und Links, but for that I need the number of watchers (only the number, not the usernames!) for every watched article (not only the top 100; namespace 0 without redirects) on the de: Wikipedia. JamesDay performed this query one year ago and saved the compressed SQL result (several MB) for me to download. Ideally this contained the TAB-separated fields "page_title" and "number_of_watchers" (which of course must be computed by the SQL query). --SirJective (84.151.205.6) 21:56, 19 December 2005 (UTC)
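The counting itself is a plain GROUP BY over the watchlist table, sketched here against an in-memory SQLite table with invented data (the real query would also join against page to skip redirects and to use page titles):

```python
import sqlite3

# Toy watchlist table; column names follow the MediaWiki schema
# (wl_user, wl_namespace, wl_title), but the rows are invented.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE watchlist (wl_user INT, wl_namespace INT, wl_title TEXT)")
con.executemany(
    "INSERT INTO watchlist VALUES (?, ?, ?)",
    [(1, 0, "Berlin"), (2, 0, "Berlin"), (3, 0, "Berlin"), (1, 0, "Hamburg")],
)
rows = con.execute(
    """SELECT wl_title, COUNT(*) AS number_of_watchers
       FROM watchlist
       WHERE wl_namespace = 0
       GROUP BY wl_title
       ORDER BY number_of_watchers DESC"""
).fetchall()
print(rows)  # [('Berlin', 3), ('Hamburg', 1)]
```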

Articles with trademark/copyright symbols

In order to ferret out spam, I would like a list of all articles on en: which have trademark, registered trademark or copyright symbols. I would like it saved to en:User:Vacuum/Spam. Thanks 70.64.119.136 21:52, 15 February 2006 (UTC)

Articles containing the HTML mnemonic entities have already been posted, and a query for the HTML decimal entities is pending. These queries take surprisingly long, so Vacuum is probably better off grepping the XML database image or getting someone else to do it. en:User:Gazpacho 20:16, 8 March 2006 (UTC)

Tracking down some anon vandals (en)

These particular vandals were editing random articles from multiple IPs in a particular subnet. There seemed to be relatively few people watching at the time. I want to make sure everything was reverted, but I don't know how to find the edits short of a DB query.

SELECT cur_title AS title, cur_user_text AS user_text, cur_timestamp AS t
FROM cur
WHERE DATE(cur_timestamp) = '2006-02-23' AND cur_user_text LIKE '62.171.%'
UNION ALL
SELECT old_title AS title, old_user_text AS user_text, old_timestamp AS t
FROM old
WHERE DATE(old_timestamp) = '2006-02-23' AND old_user_text LIKE '62.171.%'

Please post results at en:User:Gazpacho/Bunce_Island. Thank you. en:User:Gazpacho 07:04, 6 March 2006 (UTC)

Looking for large images

I am looking for large images on Wikipedia EN; wikisign apparently doesn't support running against EN.

SELECT CONCAT('Image:',img_name) as cur_title, 'en' as cur_namespace, img_width, img_height,img_size FROM image
WHERE (img_width > 1000 OR img_height > 1000)
AND img_size > 1000000
ORDER BY img_size DESC
LIMIT 1000 

Thanks - en:Ravedave 21:07, 28 March 2006 (UTC)

List of most linked redirects

Can someone please run a query for the most linked redirect pages on :sl? Please post results here. TIA! Regards, --Klemen Kocjancic (Pogovor - Hitri odgovor) 07:08, 7 June 2006 (UTC)

List of pages I created

Can someone generate me a list of articles I've created on the English Wikipedia? If it's possible, please post it to a subpage of my en:User:I@n and notify me there. Thanks -- I@n 02:46, 19 July 2006 (UTC)

List of categories with no interwiki links

Hi. Please could you generate a list of categories that have no interwiki links on them at the Georgian Wikipedia? Please put the results on my talk page at ka:User_talk:Malafaya. Thanks, Malafaya 13:18, 5 August 2006 (UTC)

I have been using this query:
select page_title , page_id from page where page_namespace=0 and page_is_redirect = 0 and page_id not in (select ll_from from langlinks)
but sometimes pages that have had interwiki links for a long time are returned and I can't figure out why... Malafaya 14:02, 21 September 2006 (UTC)

Number of Records in categorylinks

Hi, I'm currently importing the categorylinks table for enwiki (30/09/06 dump). Could someone please advise me roughly how many records there are? It's currently done about 430,000.

SELECT COUNT(*) FROM categorylinks;

Thanks! (If it imports before anyone answers this, I'll answer myself.) en:Trish1400 20:17, 06 October 2006 (GMT)

Edit: Well, it took about 36 hours. I was expecting it to take about 3, so I didn't even tweak my MySQL settings, doh! It finally imported 5,083,436 records, but with loads of timestamp errors.

List of articles without interwiki links in "el:", Greek wikipedia

Hi, I need some help for our wiki-project in the Greek Wikipedia (Βικιπαίδεια) to put interlanguage links in as many articles as possible, called el:Βικιπαίδεια:Επιχείρηση Interwiki (Wikiproject Interwiki). I have found a page listing pages without interwikis, here; however, it is not automatically updated. In fact that page is about 9 months old. The person who made it said he used SQL, but when he tried this time he got stuck. Could you please run a query for articles which do not have interwikis and save the result here? It would be a great help for our wikiproject. Many thanks in advance. --FocalPoint 18:31, 22 September 2006 (UTC)

List of links on redirection pages on :sl

Given a pool of redirects from A to B, I'm looking for a list of all pages A that have links pointing at them instead of at B (e.g. A is "US" and B is "United States", and A is a redirect to B; I would like a list of such A pages). If you could, post the results here. TIA! Regards, --Klemen Kocjančič (Talk - Fast reply) 20:41, 24 October 2006 (UTC)

"Viable article" criteria[edit]

I have a list of criteria on my en userpage that I would like to be used to find 2000 articles. Thanks (User:JayMars)

List of the 1000 most important articles our local Wikipedia has omitted

The number of translations of an article is a good indication of the relative importance of the subject. The Norwegian (Bokmål) Wikipedia would be grateful if somebody issued a query that returned the 1000 English articles with the most interwiki links while lacking a no: interwiki.

It seems only a few lines in Mathias Schindler's SQL need to be changed.


You could paste the article names here. H@r@ld 15:17, 13 March 2007 (UTC)
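The selection logic can be sketched in Python (invented data; in practice the input would be (page_title, ll_lang) pairs derived from the English langlinks table):

```python
from collections import Counter

def most_wanted(langlinks, missing_lang="no", limit=1000):
    """Rank pages by interwiki count, keeping only those that lack
    a link to `missing_lang`. langlinks: (page_title, lang) pairs."""
    counts = Counter(title for title, _lang in langlinks)
    has_lang = {title for title, lang in langlinks if lang == missing_lang}
    ranked = [(t, n) for t, n in counts.most_common() if t not in has_lang]
    return ranked[:limit]

links = [("Water", "de"), ("Water", "fr"), ("Water", "no"),
         ("Lutefisk", "de"), ("Lutefisk", "sv")]
print(most_wanted(links, limit=5))  # [('Lutefisk', 2)]
```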

Changing the native name

Please change the native name of Mazandarani wikipedia from مَزِروني to the name written below:


Mazəniki / تبری
Since even I can't understand that complex Arabic script!

Thank you, --Ali 07:21, 15 June 2007 (UTC)

Is it possible for someone to change it? --Ali 17:33, 18 June 2007 (UTC)

Completed

Small articles that don't contain Stub or Disambiguate message

DB: French Wikipedia
Query: I'd like to have the list of the 500 smallest pages that don't contain our stub message template ({{ébauche}}) or our disambiguation message template ({{Homonymie}}). Aineko
Result: User:Aoineko\Stub without message

SELECT cur_title
FROM cur 
WHERE cur_is_redirect=0 
AND cur_namespace=0
AND cur_text NOT LIKE '%{{ébauche}}%'
AND cur_text NOT LIKE '%{{Homonymie}}%'
ORDER BY LENGTH(cur_text) ASC
LIMIT 500

Done. Jamesday 13:07, 28 Jun 2004 (UTC)

Redirect with Category on fr:

DB: French Wikipedia
Query: I found some redirect pages that contained a category, so I'd like to know whether any pages of this kind remain. Aineko 06:34, 24 Jun 2004 (UTC)
Result: User:Aoineko\Redirect with Category

SELECT cur_title 
FROM cur 
WHERE cur_is_redirect=1 
AND cur_text LIKE "%[[catégorie:%"
LIMIT 100

No matches for LIKE "%[[cat_gorie:%". 12,830 match NOT LIKE "%[[cat_gorie:%". I used _ to match any character instead of é because I was having character set problems . Jamesday 09:12, 28 Jun 2004 (UTC)

That means we no longer have any redirects that contain a category tag. Great. Thank you JamesDay. Aineko 09:17, 28 Jun 2004 (UTC)

I reran this using the exact query you requested:

mysql> SELECT cur_title
    ->  FROM cur
    ->  WHERE cur_is_redirect=1
    ->  AND cur_text LIKE "%[[catégorie:%"
    ->  LIMIT 100
    -> ;
+---------------------------+
| cur_title                 |
+---------------------------+
| Liste_de_meubles          |
| Économie_comportementale |
| Fiscalité                |
| Astéroïdes_Troyens      |
+---------------------------+
4 rows in set (0.39 sec)

So, sorry, you have a little work still to do. :) Jamesday 10:56, 28 Jun 2004 (UTC)

For future reference:
The query using "cat_gorie" didn't work because é is represented by 2 bytes on fr (UTF-8); "cat__gorie" could have been used instead. I occasionally run into that trap, too :) --SirJective 14:21, 7 Nov 2004 (UTC)
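A quick Python check of the byte counts involved (on that server, LIKE patterns were compared byte-wise, so each "_" matched one byte rather than one character):

```python
s = "catégorie"
print(len(s))                    # 9 characters
print(len(s.encode("utf-8")))    # 10 bytes: "é" encodes as two bytes
print(len("é".encode("utf-8")))  # 2
```

Hence the 9-byte pattern "cat_gorie" could never match the 10-byte UTF-8 form, while "cat__gorie" can.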

Small articles that don't contain Stub message

DB: French Wikipedia
Query: I'd like to have the list of small pages that don't contain our stub message template ({{ébauche}}). Aineko 06:43, 24 Jun 2004 (UTC)
Result: User:Aoineko\Stub without message

SELECT cur_title 
FROM cur 
WHERE cur_is_redirect=0 
AND LENGTH(cur_text)<100
AND cur_namespace=0
AND cur_text NOT LIKE '%{{ébauche}}%'
LIMIT 100

or (if correct!?)

SELECT cur_title 
FROM cur 
WHERE cur_is_redirect=0 
AND cur_namespace=0
AND cur_text NOT LIKE '%{{ébauche}}%'
ORDER BY LENGTH(cur_text) ASC
LIMIT 100

Completed:

SELECT cur_title
FROM cur 
WHERE cur_is_redirect=0 
AND cur_namespace=0
AND LENGTH(cur_text) <100 AND cur_text NOT LIKE '%{{ébauche}}%'
ORDER BY LENGTH(cur_text) ASC

Since there were 108 results total I removed the limit - you can select the first 100 to get the 100 smallest if that's all you want. Jamesday 10:56, 28 Jun 2004 (UTC)

Simple: linked-to redirects

I would like to have a list of all linked-to redirects on the Simple English Wikipedia. Simple: isn't a very large wiki, so it shouldn't take long. I'll take the results at simple:User:Guanabot/Redirects.

select concat( '#[[', art_title, ']] (', links_to, ' articles)' )
from art
where art.art_is_redirect = 1
and art.art_is_stub = 0
and art.art_is_disambig = 0
order by links_to desc 

You can add a limit if you think it's necessary. Guanaco 03:51, 1 Jul 2004 (UTC)

Last 200 new users

DB: French Wikipedia
Query: We have had many newcomers in recent days, and I'd like to check that all got a welcome message. Result: User:Aoineko\New incomers

SELECT user_name
FROM user
ORDER BY user_id DESC
LIMIT 200

Small articles that don't contain Stub message

 SELECT concat('<br/>[[fr:', cur_title, ']]') as link
 FROM cur 
 WHERE cur_is_redirect=0 
 AND cur_namespace=0
 AND LENGTH(cur_text)<200
 AND cur_text NOT LIKE '%{{ébauche}}%'
 AND cur_text NOT LIKE '%{{homonymie}}%'
 ORDER BY LENGTH(cur_text) ASC
 LIMIT 1000

Completed:

SELECT cur_title
FROM cur 
WHERE cur_is_redirect=0 
AND cur_namespace=0
AND LENGTH(cur_text) <100 AND cur_text NOT LIKE '%{{ébauche}}%'
AND cur_text NOT LIKE '%{{homonymie}}%'
ORDER BY LENGTH(cur_text) ASC

Most wanted articles in Ancient Egypt category

 select
  concat('#[http://fr.wikipedia.org/wiki/',
  replace(replace(bl_to,'\'','%27'), '"','%22'),
  ' ',bl_to,'] - ') as title,
  count(distinct brokenlinks.bl_from) as value
 from
  brokenlinks,
  categorylinks
 where bl_from = cl_from
 and cl_to like '%Égypte_antique%'
 group by bl_to
 order by value desc
 LIMIT 100;

This request yields the most wanted articles which are linked from articles in the category. --SirJective 11:31, 4 Nov 2004 (UTC)

Article endings

Please give me separate lists of articles on en: that end in the following: -otomy -ectomy.

Please put them on subpages of en:User:Vacuum. Thanks. 70.64.104.100 20:04, 23 Oct 2004 (UTC)

Done. You can also do it yourself using the file "all_titles_in_ns0.gz" from the download site: Just do a grep on it. --SirJective 13:45, 5 Nov 2004 (UTC)

Interwiki: 500 articles

How do I get a list of 500 articles on, let's say, en.wikipedia, sorted by the number of interwiki links (descending), that do not contain an interwiki link to de.wikipedia? -- Mathias Schindler 18:55, 28 Jul 2004 (UTC)

I created a query for that purpose and uploaded it at de:Benutzer Diskussion:Mathias Schindler/Interwikiliste (two lines must be changed for the scanned wikipedia and the missing interwiki-code). --SirJective 20:30, 8 Nov 2004 (UTC)

Links to Howstuffworks.com

I'd like a list of English wikipedia articles that link to a page of the howstuffworks.com domain. Limit 200. You can put it on en:User:MacGyverMagic/Howstuffworks. 131.211.151.242 09:06, 1 Nov 2004 (UTC) (not logged on)

The following query should be a good start: It lists all articles that contain the string 'howstuffworks.com'. --SirJective 13:00, 4 Nov 2004 (UTC)
select cur_title
from cur
where cur_namespace = 0
and cur_text like '%howstuffworks.com%'
limit 200;
Done. --SirJective 20:37, 8 Nov 2004 (UTC)


All Pages in Wikipedia Namespace

Hi... I want to be able to find out all the pages in the hi: Wikipedia namespace... can anyone help me do that? Regards. -- 24.130.110.18 18:56, 25 Jul 2004 (UTC)
The above post is mine -- Spundun 19:00, 25 Jul 2004 (UTC)

Hello, a possible query would be the following (should you want redirects to be listed, remove the line "AND cur_is_redirect = 0").
SELECT cur_namespace, cur_title
  FROM cur
 WHERE cur_namespace=4
   AND cur_is_redirect=0
 ORDER BY cur_title
 LIMIT 2000;
In this form it can be used by a sysop with the Special:Asksql function. Alternatively I could download the hi database and produce the list for you. In that case you should tell me where to place it. --SirJective 11:43, 4 Nov 2004 (UTC)
Thanks for your response. Someone on #mediawiki already ran the query for me. --Spundun 00:09, 9 Nov 2004 (UTC)

Meta titles with "bad" in them[edit]

It would be awesome to have the results of this in meta:

SELECT cur_namespace, cur_title FROM cur
WHERE cur_title REGEXP '[Bb]ad';

The database dump from 4/6 is from too long ago to pick up the page I need to identify. Thanks! Demi 00:54, 12 Apr 2005 (UTC)

I found what I needed, thanks. Demi 19:17, 21 Apr 2005 (UTC)

List of protected userpages on enwiki

I need a couple of quick requests done; they shouldn't take that long or be that stressful on the server, since they're just read requests. Please run on enwiki and post the results to en:User talk:Pegasus1138 (don't worry about the fact that it will be a very long thread; I'll deal with it afterwards).

SELECT page_title AS protectedUserPages FROM page WHERE page_namespace = "2" AND page_restrictions = "move=sysop:edit=sysop";
SELECT page_title AS protectedUserPages FROM page WHERE page_namespace = "2" AND page_restrictions = "move=autoconfirmed:edit=sysop";
SELECT page_title AS protectedUserPages FROM page WHERE page_namespace = "2" AND page_restrictions = "move=sysop:edit=autoconfirmed";
SELECT page_title AS protectedUserPages FROM page WHERE page_namespace = "2" AND page_restrictions = "move=:edit=autoconfirmed";
SELECT page_title AS protectedUserPages FROM page WHERE page_namespace = "2" AND page_restrictions = "move=autoconfirmed:edit=";
SELECT page_title AS protectedUserPages FROM page WHERE page_namespace = "2" AND page_restrictions = "move=sysop:edit=";
SELECT page_title AS protectedUserPages FROM page WHERE page_namespace = "2" AND page_restrictions = "move=:edit=sysop";

Refreshing

Could you please update the Catalan statistics and special pages? How often do you do that? People keep asking in our "taverna" (community discussion page). Thanks (July 05)

Admin abuse

Hello, I'd like some information on where to report administrator abuse on the Serbian Wikipedia. A user has been blocked indefinitely for an incident that didn't even happen on the Serbian Wikipedia, and a few administrators who have had a personal conflict with him for a while now refuse to unblock him or even shorten his block. The user was an administrator only a few weeks ago, and lost his adminship on sr wiki because of a personal conflict with a few admins. I'd like to request an immediate vote on whether this user should be blocked or not, because it's obvious that not all users on sr wiki want the user in question blocked. Thank you. --It's me! 18:24, 13 August 2006 (UTC)

Protected Pages Without Archive Tag

I would like a list of all archived articles on the English Wikinews that are protected but do not have an archive tag on them. I believe the correct query is below. Please provide the results at n:User:Thunderhead/archive info

SELECT cur_title
FROM cur
WHERE cur_is_redirect=0
AND cur_namespace=0
AND cur_text NOT LIKE '%{{archive}}%'
AND cur_restrictions = 'move=sysop:edit=sysop'
ORDER BY LENGTH(cur_text) ASC
LIMIT 2000

Thank you. Thunderhead 04:41, 4 April 2007 (UTC)

Done. Byrial 13:29, 29 May 2007 (UTC)

Users with 400 edits

Please create a list of all users who have 400 or more edits on the ar Wikipedia, if possible.--The Joke النكتة‎ 22:58, 2 June 2007 (UTC)

Done. Byrial 00:13, 3 June 2007 (UTC)

List of China/Chinese-related categories in enwiki and corresponding titles (could be null) in Chinese

Example query (query may need modifications):

SELECT cl.cl_to,ll.ll_title 
FROM categorylinks cl 
LEFT JOIN (select ll_from,ll_title from langlinks where ll_lang='zh') ll 
ON (cl.cl_from=ll.ll_from )
WHERE cl.cl_to LIKE '%China%' OR 
      cl.cl_to LIKE '%Chinese%' 

Help would be appreciated. You can put the result on zh:User:Zhenqinli/SQL. Thanks, --Zhenqinli 01:25, 5 June 2007 (UTC)

If you can use a list based on data from January 2007, I can make one for you using the not-up-to-date toolserver database for enwiki. Byrial 19:43, 5 June 2007 (UTC)
Done. Byrial 11:28, 6 June 2007 (UTC)
Thank you so much! Does the query take a long time? I am thinking of eventually also having a query run on the enwiki pages (instead of categories). --Zhenqinli 14:22, 6 June 2007 (UTC)
No, I think it took around one minute. I can also make a list of all articles (pages in namespace 0) in enwiki with China or Chinese in the name, but the list will be very long, with 4897 articles. Byrial 23:48, 6 June 2007 (UTC)
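The LEFT JOIN pattern in the request above can be exercised on a tiny mock of the `categorylinks` and `langlinks` tables. All rows are invented, and sqlite3 stands in for MySQL; the point is to show how the filtered subquery keeps categories without a zh interlanguage link in the result with a NULL title.

```python
import sqlite3

# Assumed minimal subsets of the MediaWiki link tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE categorylinks (cl_from INTEGER, cl_to TEXT);
    CREATE TABLE langlinks (ll_from INTEGER, ll_lang TEXT, ll_title TEXT);
""")
conn.executemany("INSERT INTO categorylinks VALUES (?, ?)", [
    (1, "History_of_China"),
    (2, "Chinese_cuisine"),
    (3, "History_of_France"),   # no China/Chinese match -- excluded
])
conn.executemany("INSERT INTO langlinks VALUES (?, ?, ?)", [
    (1, "zh", "中国历史"),
    (2, "fr", "Cuisine_chinoise"),  # wrong language -- dropped by subquery
])

# LEFT JOIN keeps every matching category; ll_title is NULL (None)
# where no zh langlink exists.
rows = conn.execute("""
    SELECT cl.cl_to, ll.ll_title
    FROM categorylinks cl
    LEFT JOIN (SELECT ll_from, ll_title
               FROM langlinks WHERE ll_lang = 'zh') ll
      ON cl.cl_from = ll.ll_from
    WHERE cl.cl_to LIKE '%China%' OR cl.cl_to LIKE '%Chinese%'
""").fetchall()
print(rows)
```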

Rejected requests

Do this please: (to the English WP)

UPDATE user SET user_rights='sysop' WHERE user_name='DX';
  • For that request, please contact an en bureaucrat and identify the vote for adminship which shows that the community approved the request to make that user an administrator on en. Jamesday 07:55, 6 Sep 2004 (UTC)
    • We're not such big idiots as you seem to think. 68.73.229.213 03:23, 24 Sep 2004 (UTC)
      • Give the guy some credit though... it made me laugh! 69.178.0.139 04:16, 7 Dec 2004 (UTC)
        • Done :) 08:18, 20 Jun 2005 (UTC)
  • Ha ha that's funny! He must not honestly think he can become a sysop that easily!

Please

I don't know why this happened and I really need to fix it. I'm in charge of the following wiki website: bda.assoces.com. But I have a database error:


Database error. An article from Zone Art. Syntax error in the database. The last query processed by the database was:

   (SQL query hidden)

from the function "MediaWikiBagOStuff:_doquery". MySQL returned the error "145: Table './bda/wiki_objectcache' is marked as crashed and should be repaired (localhost)".

You can mail me at

Thanks a lot (& sorry for my bad English)

PS: I really didn't know where to put this request; please feel free to move it if this isn't the right place.

I suggest you ask on the MediaWiki mailing list or forums. There might be a script in your maintenance directory that you can use to repair this. Angela 13:20, 10 April 2007 (UTC)